Results 1–10 of 27
Networks of Spiking Neurons: The Third Generation of Neural Network Models
Neural Networks, 1997
"... The computational power of formal models for networks of spiking neurons is compared with that of other neural network models based on McCulloch Pitts neurons (i.e. threshold gates) respectively sigmoidal gates. In particular it is shown that networks of spiking neurons are computationally more powe ..."
Abstract

Cited by 182 (16 self)
 Add to MetaCart
(Show Context)
The computational power of formal models for networks of spiking neurons is compared with that of other neural network models based on McCulloch-Pitts neurons (i.e., threshold gates) or sigmoidal gates, respectively. In particular, it is shown that networks of spiking neurons are computationally more powerful than these other neural network models. A concrete biologically relevant function is exhibited which can be computed by a single spiking neuron (for biologically reasonable values of its parameters), but which requires hundreds of hidden units on a sigmoidal neural net. This article does not assume prior knowledge about spiking neurons, and it contains an extensive list of references to the currently available literature on computations in networks of spiking neurons and relevant results from neurobiology. 1. Definitions and Motivations. If one classifies neural network models according to their computational units, one can distinguish three different generations. The first generation i...
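The "third generation" units described in this abstract can be illustrated with a minimal leaky integrate-and-fire neuron, a standard simple spiking model (the paper works with a more general formal model; the function name and all parameter values below are purely illustrative):

```python
def lif_spike_times(input_current, threshold=1.0, tau=10.0, dt=0.1, t_max=100.0):
    """Spike times of a leaky integrate-and-fire neuron driven by a
    constant input current; the potential resets to 0 after each spike."""
    v, t, spikes = 0.0, 0.0, []
    while t < t_max:
        v += dt * (input_current - v) / tau  # leaky integration step
        if v >= threshold:                   # threshold crossing -> spike
            spikes.append(round(t, 1))
            v = 0.0
        t += dt
    return spikes
```

Unlike a threshold or sigmoidal gate, the output is a list of firing times rather than a single analog value, and a subthreshold input (one whose steady-state potential stays below the threshold) produces no spikes at all.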
Lower Bounds for the Computational Power of Networks of Spiking Neurons
Neural Computation, 1995
"... We investigate the computational power of a formal model for networks of spiking neurons. It is shown that simple operations on phasedifferences between spiketrains provide a very powerful computational tool that can in principle be used to carry out highly complex computations on a small network o ..."
Abstract

Cited by 65 (18 self)
 Add to MetaCart
We investigate the computational power of a formal model for networks of spiking neurons. It is shown that simple operations on phase differences between spike trains provide a very powerful computational tool that can in principle be used to carry out highly complex computations on a small network of spiking neurons. We construct networks of spiking neurons that simulate arbitrary threshold circuits, Turing machines, and a certain type of random access machine with real-valued inputs. We also show that relatively weak basic assumptions about the response and threshold functions of the spiking neurons are sufficient in order to employ them for such computations. 1. Introduction and Basic Definitions. There exists substantial evidence that timing phenomena such as temporal differences between spikes and frequencies of oscillating subsystems are integral parts of various information processing mechanisms in biological neural systems (for a survey and references see e.g. Kandel et al., ...
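One way to see how operations on spike timing can emulate a threshold gate is coincidence detection: a unit fires iff enough input spikes arrive within a short window. The sketch below is a toy version under that assumption, not the construction used in the paper:

```python
def coincidence_gate(spike_times, k=2, window=1.0):
    """Toy threshold gate on spike timing: report a spike iff at least k
    input spikes fall within one coincidence window of the given length."""
    times = sorted(spike_times)
    for start in times:
        if sum(1 for t in times if start <= t < start + window) >= k:
            return True
    return False
```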
On the computational complexity of networks of spiking neurons
Advances in Neural Information Processing Systems, 1995
"... 2 Abstract We investigate the computational power of a formal model for networks of spiking neurons. It is shown that simple operations on phasedifferences between spiketrains provide a very powerful computational tool that can in principle be used to carry out highly complex computations on a sma ..."
Abstract

Cited by 24 (12 self)
 Add to MetaCart
We investigate the computational power of a formal model for networks of spiking neurons. It is shown that simple operations on phase differences between spike trains provide a very powerful computational tool that can in principle be used to carry out highly complex computations on a small network of spiking neurons. We construct networks of spiking neurons that simulate arbitrary threshold circuits, Turing machines, and a certain type of random access machine with real-valued inputs. We also show that relatively weak basic assumptions about the response and threshold functions of the spiking neurons are sufficient in order to employ them for such computations. Furthermore, we prove upper bounds for the computational power of networks of spiking neurons with arbitrary piecewise linear response and threshold functions, and show that with regard to real-time simulations they are computationally equivalent to a certain type of random access machine, and to recurrent analog neural nets with piecewise linear activation functions. In addition, we give corresponding results for networks of spiking neurons with a limited timing precision, and we prove upper and lower bounds for the VC-dimension and pseudo-dimension of networks of spiking neurons.
Self-Organization of Spiking Neurons Using Action Potential Timing
1998
"... We propose a mechanism for unsupervised learning in networks of spiking neurons which is based on the timing of single firing events. Our results show that a topology preserving behaviour quite similar to that of Kohonen's selforganizing map can be achieved using temporal coding. In contrast t ..."
Abstract

Cited by 18 (0 self)
 Add to MetaCart
We propose a mechanism for unsupervised learning in networks of spiking neurons which is based on the timing of single firing events. Our results show that a topology-preserving behaviour quite similar to that of Kohonen's self-organizing map can be achieved using temporal coding. In contrast to previous approaches, which use rate coding, the winner among competing neurons can be determined quickly and locally. Our model is a further step towards a more realistic description of unsupervised learning in biological neural systems. Furthermore, it may provide a basis for fast implementations in pulsed VLSI. Keywords: self-organizing map, spiking neurons, temporal coding, unsupervised learning. I. Introduction. In the area of modelling information processing in biological neural systems, there is an ongoing debate about which essentials have to be taken into account (see e.g. [1], [2], [3], [4]). Discrete models, such as threshold gates or McCulloch-Pitts neurons, are undoubtedly very simplis...
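A minimal sketch of the idea, assuming the winner is the unit whose weight vector is closest to the input and hence, under time-to-first-spike coding, the one that fires earliest; `som_step` and all constants are illustrative, not the paper's model:

```python
import math

def som_step(weights, x, eta=0.5, sigma=1.0):
    """One toy update of a 1-D self-organizing map: the winner is the unit
    with the earliest first spike (firing time proportional to the distance
    between its weight vector and the input); neighbours move toward x."""
    spike_times = [math.dist(w, x) for w in weights]  # closer -> earlier spike
    winner = min(range(len(weights)), key=spike_times.__getitem__)
    for i in range(len(weights)):
        h = math.exp(-((i - winner) ** 2) / (2 * sigma ** 2))  # neighbourhood
        weights[i] = [wj + eta * h * (xj - wj) for wj, xj in zip(weights[i], x)]
    return winner
```

Note that the winner is found by a minimum over first-spike times, which lateral inhibition can compute locally; no global comparison of firing rates is needed.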
On the relevance of time in neural computation and learning
Proc. of the 8th International Conference on Algorithmic Learning Theory, Sendai (Japan), in "Springer Lecture Notes in Computer Science" (Ming Li and Akira Maruoka, Eds.), 1997
"... ..."
(Show Context)
On computation with pulses
Information and Computation, 1999
"... We explore the computational power of formal models for computation with pulses. Such models are motivated by realistic models for biological neurons, and by related new types of VLSI (\pulse stream VLSI"). In preceding work it was shown that the computational power of formal models for com ..."
Abstract

Cited by 15 (1 self)
 Add to MetaCart
We explore the computational power of formal models for computation with pulses. Such models are motivated by realistic models for biological neurons, and by related new types of VLSI ("pulse stream VLSI"). In preceding work it was shown that the computational power of formal models for computation with pulses is quite high if the pulses arriving at a computational unit have an approximately linearly rising or linearly decreasing initial segment. This property is satisfied by common models for biological neurons. On the other hand, several implementations of pulse stream VLSI employ pulses that are approximately piecewise constant (i.e., step functions). In this article we investigate the relevance of the shape of pulses in formal models for computation with pulses. It turns out that the computational power drops significantly if one replaces pulses with linearly rising or decreasing initial segments by piecewise constant pulses. We provide an exact characterization of the latter model in terms of a weak version of a random access machine (RAM). We also compare the language recognition capability of a recurrent version of this model with that of deterministic finite automata and Turing machines.
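The effect of the pulse shape can be sketched directly, assuming a hypothetical `firing_time` helper: with a linearly rising pulse the threshold-crossing time varies continuously with the weights, while with a piecewise constant pulse the potential changes only at pulse arrival times, so the output time is confined to a finite set:

```python
def firing_time(weights, arrivals, pulse, threshold=1.0, dt=0.01, t_max=20.0):
    """First time the weighted sum of incoming pulses crosses the threshold;
    pulse(s) is a pulse's value s time units after its arrival."""
    t = 0.0
    while t < t_max:
        u = sum(w * pulse(t - a) for w, a in zip(weights, arrivals) if t >= a)
        if u >= threshold:
            return t
        t += dt
    return None

linear_pulse = lambda s: min(s, 1.0)  # linearly rising initial segment
step_pulse = lambda s: 1.0            # piecewise constant pulse
```

With `linear_pulse`, nudging a weight shifts the crossing time by a proportional amount; with `step_pulse` the crossing can only occur at an arrival time, mirroring the drop in computational power shown in the paper.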
Learning Temporally Encoded Patterns in Networks of Spiking Neurons
Neural Processing Letters, 1997
"... Networks of spiking neurons are very powerful and versatile models for biological and artificial information processing systems. Especially for modelling pattern analysis tasks in a biologically plausible way that require short response times with high precision they seem to be more appropriate than ..."
Abstract

Cited by 14 (1 self)
 Add to MetaCart
(Show Context)
Networks of spiking neurons are very powerful and versatile models for biological and artificial information processing systems. For modelling pattern analysis tasks that require short response times with high precision in a biologically plausible way, they seem more appropriate than networks of threshold gates or models that encode analog values in average firing rates. We investigate the question of how neurons can learn on the basis of time differences between firing times. In particular, we provide learning rules of the Hebbian type in terms of single spiking events of the pre- and postsynaptic neuron, and show that the weights approach a value given by the difference between pre- and postsynaptic firing times with arbitrarily high precision.
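The convergence claim can be sketched with a toy update rule, assuming each pre/post spike pairing moves the weight a small step toward the firing-time difference (the paper's actual rules are stated in terms of single spiking events; `eta` and the function name are illustrative):

```python
def timing_hebb_update(w, t_pre, t_post, eta=0.1):
    """One Hebbian-style step driving the weight toward the difference
    between postsynaptic and presynaptic firing times."""
    return w + eta * ((t_post - t_pre) - w)

w = 0.0
for _ in range(200):  # repeated pairings with fixed relative timing
    w = timing_hebb_update(w, t_pre=1.0, t_post=4.0)
```

Under this rule the weight approaches t_post - t_pre geometrically, i.e. with arbitrarily high precision.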
An Efficient Implementation of Sigmoidal Neural Nets in Temporal Coding with Noisy Spiking Neurons
1995
"... We show that networks of relatively realistic mathematical models for biological neurons can in principle simulate arbitrary feedforward sigmoidal neural nets in a way which has previously not been considered. This new approach is based on temporal coding by single spikes (respectively by the timing ..."
Abstract

Cited by 11 (4 self)
 Add to MetaCart
We show that networks of relatively realistic mathematical models for biological neurons can in principle simulate arbitrary feedforward sigmoidal neural nets in a way which has previously not been considered. This new approach is based on temporal coding by single spikes (respectively by the timing of synchronous firing in pools of neurons), rather than on the traditional interpretation of analog variables in terms of firing rates. The resulting new simulation is substantially faster and hence more consistent with experimental results about the maximal speed of information processing in cortical neural systems. As a consequence we can show that networks of noisy spiking neurons are "universal approximators" in the sense that they can approximate with regard to temporal coding any given continuous function of several variables. This result holds for a fairly large class of schemes for coding analog variables by firing times of spiking neurons. Our new proposal for the possible organiza...
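The coding schemes in question map an analog value to a firing time, with larger values firing earlier. A minimal sketch, with T_REF and SCALE as purely illustrative constants:

```python
T_REF = 10.0  # reference time: a spike at T_REF encodes the value 0
SCALE = 5.0   # time units per unit of analog value

def encode(x):
    """Encode x in [0, 1] as a firing time: larger values fire earlier."""
    return T_REF - SCALE * x

def decode(t):
    """Recover the analog value from a firing time."""
    return (T_REF - t) / SCALE
```

Because the code is linear, a neuron whose firing time depends linearly on its input firing times computes a weighted sum of the encoded values, which is the basic step in simulating a sigmoidal gate under temporal coding.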
Hebbian Spike-Timing Dependent Self-Organization in Pulsed Neural Networks
In Proceedings of World Congress on Neuroinformatics, 2001
"... We present a mechanism of unsupervised competitive learning and development of topology preserving selforganizing maps of spiking neurons. The information encoding is based on the precise timing of single spike events. The work provides a competitive learning algorithm that is based on the relative ..."
Abstract

Cited by 10 (4 self)
 Add to MetaCart
We present a mechanism of unsupervised competitive learning and development of topology-preserving self-organizing maps of spiking neurons. The information encoding is based on the precise timing of single spike events. The work provides a competitive learning algorithm that is based on the relative timing of the pre- and postsynaptic spikes, local synapse competition within a single neuron, and global competition via lateral connections. Furthermore, we present part of the experimental work on the capability of the suggested mechanism to perform topology-preserving mapping and competitive learning. The results show that our model covers the main characteristic behaviour of the standard SOM but uses a computationally more powerful timing-dependent spike encoding.