Results 1-10 of 40
Networks of Spiking Neurons: The Third Generation of Neural Network Models
Neural Networks, 1997
Cited by 138 (12 self)
Abstract
The computational power of formal models for networks of spiking neurons is compared with that of other neural network models based on McCulloch-Pitts neurons (i.e. threshold gates) or sigmoidal gates, respectively. In particular it is shown that networks of spiking neurons are computationally more powerful than these other neural network models. A concrete biologically relevant function is exhibited which can be computed by a single spiking neuron (for biologically reasonable values of its parameters), but which requires hundreds of hidden units on a sigmoidal neural net. This article does not assume prior knowledge about spiking neurons, and it contains an extensive list of references to the currently available literature on computations in networks of spiking neurons and relevant results from neurobiology. 1 Definitions and Motivations: If one classifies neural network models according to their computational units, one can distinguish three different generations. The first generation ...
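The single-neuron separation result above rests on coincidence detection. As a minimal sketch (the triangular EPSP shape and all parameter values are our own illustrative choices, not the paper's construction), a spiking neuron whose potential sums transient EPSPs crosses threshold only when input spikes arrive close together in time:

```python
def epsp(t, rise=1.0, decay=1.0):
    """Triangular EPSP: rises linearly for `rise` ms, then decays linearly."""
    if t < 0:
        return 0.0
    if t <= rise:
        return t / rise
    if t <= rise + decay:
        return 1.0 - (t - rise) / decay
    return 0.0

def fires(spike_times, threshold=1.5, dt=0.01, horizon=5.0):
    """Check whether the summed EPSPs ever cross the firing threshold."""
    for i in range(int(horizon / dt)):
        t = i * dt
        if sum(epsp(t - s) for s in spike_times) >= threshold:
            return True
    return False

# Near-coincident spikes cross threshold; the same spikes 3 ms apart do not.
print(fires([1.0, 1.1]))  # True
print(fires([1.0, 4.0]))  # False
```

A sigmoidal unit sees only the weighted sum of its inputs, so emulating this timing sensitivity requires many hidden units.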
Spatial and Temporal Pattern Analysis via Spiking Neurons
Network: Computation in Neural Systems, 1998
Cited by 46 (0 self)
Abstract
Spiking neurons, receiving temporally encoded inputs, can compute radial basis functions (RBFs) by storing the relevant information in their delays. In this paper we show how these delays can be learned using exclusively locally available information (basically the time difference between the pre- and postsynaptic spike). Our approach gives rise to a biologically plausible algorithm for finding clusters in a high-dimensional input space with networks of spiking neurons, even if the environment is changing dynamically. Furthermore, we show that our learning mechanism makes it possible for such RBF neurons to perform a kind of feature extraction in which they recognize that only certain input coordinates carry relevant information. Finally we demonstrate that this model allows the recognition of temporal sequences even if they are distorted in various ways. 1. Introduction: Radial basis functions (RBFs) have turned out to be among the most powerful artificial neural network types ...
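The delay-learning idea can be sketched roughly as follows (a simplified stand-in for the paper's algorithm: the learning rate is arbitrary and the mean arrival time is used as a proxy for the postsynaptic firing time, which in the model occurs when delayed arrivals coincide):

```python
def train_delays(patterns, delays, lr=0.2, epochs=50):
    """Adapt per-synapse delays so that delayed spikes arrive simultaneously.

    Local rule: each delay is nudged so its delayed arrival time t_i + d_i
    approaches the (proxy) postsynaptic firing time. Only the time difference
    between pre- and postsynaptic events is used, as in the paper's setting.
    """
    for _ in range(epochs):
        for t in patterns:
            arrivals = [ti + di for ti, di in zip(t, delays)]
            t_post = sum(arrivals) / len(arrivals)
            delays = [di + lr * (t_post - a) for di, a in zip(delays, arrivals)]
    return delays

# A repeated input pattern with spikes at 0, 2 and 5 ms.
pattern = [0.0, 2.0, 5.0]
d = train_delays([pattern], [1.0, 1.0, 1.0])
arrivals = [round(ti + di, 3) for ti, di in zip(pattern, d)]
print(arrivals)  # all arrivals converge to (nearly) the same time
```

After training, the delays mirror the stored pattern, so the neuron responds maximally (RBF-like) exactly when this spike pattern recurs.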
Paradigms for Computing with Spiking Neurons
1999
Cited by 42 (1 self)
Abstract
... this technical difficulty by considering, for example in a simplified setting, only correlation variables ...
On the computational complexity of networks of spiking neurons
Advances in Neural Information Processing Systems, 1995
Cited by 18 (7 self)
Abstract
We investigate the computational power of a formal model for networks of spiking neurons. It is shown that simple operations on phase differences between spike trains provide a very powerful computational tool that can in principle be used to carry out highly complex computations on a small network of spiking neurons. We construct networks of spiking neurons that simulate arbitrary threshold circuits, Turing machines, and a certain type of random access machine with real-valued inputs. We also show that relatively weak basic assumptions about the response and threshold functions of the spiking neurons are sufficient in order to employ them for such computations. Furthermore we prove upper bounds for the computational power of networks of spiking neurons with arbitrary piecewise linear response and threshold functions, and show that with regard to real-time simulations they are computationally equivalent to a certain type of random access machine, and to recurrent analog neural nets with piecewise linear activation functions. In addition we give corresponding results for networks of spiking neurons with a limited timing precision, and we prove upper and lower bounds for the VC-dimension and pseudo-dimension of networks of spiking neurons.
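As a toy illustration of computing with phase differences (the encoding convention below is our own assumption, not the paper's construction), an analog value can be carried by the delay of a spike relative to a reference spike, and adding a constant then amounts to nothing more than a synaptic delay:

```python
T_REF = 10.0  # reference spike time in ms (an assumed convention)

def encode(x):
    """Encode x in [0, 1] as a spike arriving x ms after the reference spike."""
    return T_REF + x

def decode(spike_time):
    """Recover x as the phase difference to the reference spike."""
    return spike_time - T_REF

def delay_add(x, c):
    """Adding a constant c is realized purely as an extra synaptic delay."""
    return decode(encode(x) + c)

print(round(decode(encode(0.3)), 6))  # 0.3
print(round(delay_add(0.3, 0.5), 6))  # 0.8
```

The paper's constructions build on such timing operations to simulate threshold circuits and Turing machines on small networks.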
Self-Organization of Spiking Neurons Using Action Potential Timing
1998
Cited by 18 (0 self)
Abstract
We propose a mechanism for unsupervised learning in networks of spiking neurons which is based on the timing of single firing events. Our results show that a topology-preserving behaviour quite similar to that of Kohonen's self-organizing map can be achieved using temporal coding. In contrast to previous approaches, which use rate coding, the winner among competing neurons can be determined quickly and locally. Our model is a further step towards a more realistic description of unsupervised learning in biological neural systems. Furthermore, it may provide a basis for fast implementations in pulsed VLSI. Keywords: self-organizing map, spiking neurons, temporal coding, unsupervised learning. I. Introduction: In the area of modelling information processing in biological neural systems, there is an ongoing debate about which essentials have to be taken into account (see e.g. [1], [2], [3], [4]). Discrete models, such as threshold gates or McCulloch-Pitts neurons, are undoubtedly very simplis...
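The temporal winner-take-all idea can be sketched as follows (a toy reading with our own parameter choices, not the paper's model): each competing unit stores a pattern as a vector of synaptic delays, and the unit whose delays best mirror the input spike times accumulates coincident arrivals earliest and therefore fires first, so the winner is read off locally from firing times:

```python
def spike_time(arrivals, threshold=2.5, dt=0.01, horizon=20.0, tau=1.0):
    """First time the summed (linearly decaying) PSPs cross threshold; inf if never."""
    for i in range(int(horizon / dt)):
        t = i * dt
        v = sum(max(0.0, 1.0 - (t - a) / tau) for a in arrivals if a <= t)
        if v >= threshold:
            return t
    return float("inf")

def winner(input_times, delay_vectors, threshold=2.5):
    """Index of the first unit to fire: its delays best match the input times."""
    times = [spike_time([t + d for t, d in zip(input_times, ds)], threshold)
             for ds in delay_vectors]
    return min(range(len(times)), key=times.__getitem__)

inputs = [0.0, 2.0, 5.0]
units = [[5.0, 3.0, 0.0],   # mirrors the input: all arrivals coincide at t = 5
         [0.0, 0.0, 0.0]]   # unmatched delays: arrivals spread out, never sum up
print(winner(inputs, units))  # 0
```

No global comparison of activations is needed, which is the advantage over rate-coded winner-take-all circuits.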
The Neural Network Pushdown Automaton: Model, Stack and Learning Simulations
1993
Cited by 17 (2 self)
Abstract
In order for neural networks to learn complex languages or grammars, they must have sufficient computational power or resources to recognize or generate such languages. Though many approaches to effectively utilizing the computational power of neural networks have been discussed, an obvious one is to couple a recurrent neural network with an external stack memory, in effect creating a neural network pushdown automaton (NNPDA). This NNPDA generalizes the concept of a recurrent network so that the network becomes a more complex computing structure. This paper discusses an NNPDA in detail: its construction, how it can be trained, and how useful symbolic information can be extracted from the trained network. To effectively couple the external stack to the neural network, an optimization method is developed which uses an error function that connects the learning of the state automaton of the neural network to the learning of the operation of the external stack: push, pop, and no-operation. To minimize the error function using gradient descent learning, an analog stack is designed such that the action and storage of information in the stack are continuous. One interpretation of a continuous stack is the probabilistic storage of, and action on, data. After training on sample strings of an unknown source grammar, a quantization procedure extracts from the analog stack and neural network a discrete pushdown automaton (PDA). Simulations show that in learning deterministic context-free grammars (the balanced parenthesis language, 1^n 0^n, and the deterministic palindrome language) the extracted PDA is correct in the sense that it can correctly recognize unseen strings of arbitrary length. In addition, the extracted PDAs can be shown to be identical or equivalent to the PDAs of the source grammars which were used to generate the training strings.
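The continuous stack can be sketched like this (a simplified reading of the idea; the class and method names are ours): push and pop take analog strengths in [0, 1], a pop of strength a removes a total "thickness" a from the top, possibly consuming fractions of several stored entries, and reading blends the topmost unit of thickness:

```python
class AnalogStack:
    """Stack of (value, thickness) pairs supporting fractional push/pop."""

    def __init__(self):
        self.items = []  # list of [value, thickness]

    def push(self, value, strength):
        if strength > 0:
            self.items.append([value, strength])

    def pop(self, strength):
        """Remove total thickness `strength` from the top, splitting entries."""
        while strength > 1e-12 and self.items:
            value, thick = self.items[-1]
            take = min(thick, strength)
            strength -= take
            if take == thick:
                self.items.pop()
            else:
                self.items[-1][1] = thick - take

    def top(self):
        """Soft read: average the values in the topmost unit of thickness."""
        total, acc = 0.0, 0.0
        for value, thick in reversed(self.items):
            take = min(thick, 1.0 - total)
            acc += value * take
            total += take
            if total >= 1.0:
                break
        return acc / total if total else 0.0

s = AnalogStack()
s.push(1.0, 0.6)            # push symbol "1" with analog strength 0.6
s.push(0.0, 0.7)            # push symbol "0" with analog strength 0.7
print(round(s.top(), 6))    # 0.3: a blend of 0.7 of "0" and 0.3 of "1"
s.pop(0.7)
print(round(s.top(), 6))    # 1.0: the "0" layer has been removed
```

Because all actions are continuous, the stack is differentiable enough for gradient descent training, and discrete PDA behaviour is recovered afterwards by quantization.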
On Computing Boolean Functions by a Spiking Neuron
Annals of Mathematics and Artificial Intelligence, 1998
Cited by 16 (2 self)
Abstract
Computations by spiking neurons are performed using the timing of action potentials. We investigate the computational power of a simple model for such a spiking neuron in the Boolean domain by comparing it with traditional neuron models such as threshold gates (or McCulloch-Pitts neurons) and sigma-pi units (or polynomial threshold gates). In particular, we estimate the number of gates required to simulate a spiking neuron by a disjunction of threshold gates, and we establish tight bounds for this threshold number. Furthermore, we analyze the degree of the polynomials that a sigma-pi unit must use for the simulation of a spiking neuron. We show that this degree cannot be bounded by any fixed value. Our results give evidence that the use of continuous time as a computational resource endows single-cell models with substantially larger computational capabilities. 1 Introduction: Biological neurons communicate by sending spikes among themselves. A spike is a discrete event in continuous ti...
On computation with pulses
Information and Computation, 1999
Cited by 14 (0 self)
Abstract
We explore the computational power of formal models for computation with pulses. Such models are motivated by realistic models for biological neurons, and by related new types of VLSI ("pulse stream VLSI"). In preceding work it was shown that the computational power of formal models for computation with pulses is quite high if the pulses arriving at a computational unit have an approximately linearly rising or linearly decreasing initial segment. This property is satisfied by common models for biological neurons. On the other hand, several implementations of pulse stream VLSI employ pulses that are approximately piecewise constant (i.e. step functions). In this article we investigate the relevance of the shape of pulses in formal models for computation with pulses. It turns out that the computational power drops significantly if one replaces pulses with linearly rising or decreasing initial segments by piecewise constant pulses. We provide an exact characterization of the latter model in terms of a weak version of a random access machine (RAM). We also compare the language recognition capability of a recurrent version of this model with that of deterministic finite automata and Turing machines.
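The gap between the two pulse shapes can be made tangible with a toy comparison (the pulse forms and threshold below are illustrative assumptions, not the paper's formal model): with a linearly rising pulse, the threshold-crossing time varies continuously with the synaptic weight, so timing can carry analog information; with a piecewise-constant pulse, every sufficiently large weight crosses at the same instant:

```python
def crossing_time(pulse, weight, threshold=0.5, dt=0.001, horizon=2.0):
    """First time t with weight * pulse(t) >= threshold, or None if never."""
    for i in range(int(horizon / dt)):
        t = i * dt
        if weight * pulse(t) >= threshold:
            return round(t, 3)
    return None

linear = lambda t: min(t, 1.0)            # linearly rising pulse, saturating at 1
step = lambda t: 1.0 if t >= 0 else 0.0   # piecewise-constant (step) pulse

# Linear pulse: larger weights cross the threshold earlier (graded timing).
print([crossing_time(linear, w) for w in (0.6, 0.8, 1.0)])
# Step pulse: all sufficient weights cross at the same instant.
print([crossing_time(step, w) for w in (0.6, 0.8, 1.0)])
```

The step-shaped pulse collapses the continuum of crossing times to a single value, which is the intuition behind the drop in computational power.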
On the Relevance of Time in Neural Computation and Learning
In Proceedings of the 8th International Workshop on Algorithmic Learning Theory, ALT'97, 1997
Cited by 14 (0 self)
Abstract
We discuss models for computation in biological neural systems that are based on the current state of knowledge in neurophysiology. Differences and similarities to traditional neural network models are highlighted. It turns out that many important questions regarding computation and learning in biological neural systems cannot be adequately addressed in traditional neural network models. In particular, the role of time is quite different in biologically more realistic models, and many fundamental questions regarding computation and learning have to be rethought in this context. Simultaneously, a new generation of VLSI chips is emerging ("pulsed VLSI") where new ideas about computing and learning with temporal coding can be tested. Articles with details and further pointers to the literature can be found at http://www.cis.tugraz.ac.at/igi/maass/ . 1 Introduction: An analysis of the role of a gate g in a computation on a familiar computational model, such as a boolean circuit or an artifi...
Learning Temporally Encoded Patterns in Networks of Spiking Neurons
Neural Processing Letters, 1997
Cited by 11 (1 self)
Abstract
Networks of spiking neurons are very powerful and versatile models for biological and artificial information processing systems. In particular, for modelling pattern analysis tasks that require short response times with high precision in a biologically plausible way, they seem more appropriate than networks of threshold gates or models that encode analog values in average firing rates. We investigate the question of how neurons can learn on the basis of time differences between firing times. In particular, we provide learning rules of the Hebbian type in terms of single spiking events of the pre- and postsynaptic neuron, and show that the weights approach a value given by the difference between pre- and postsynaptic firing times with arbitrarily high precision.
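A minimal sketch of such a rule (our own toy form, not the paper's exact rule: the update is simply driven toward the post-minus-pre firing-time difference, with an arbitrary learning rate):

```python
def learn(spike_pairs, w=0.0, lr=0.1):
    """Hebbian-style rule driven by single spike pairs.

    On each (t_pre, t_post) event the weight is nudged toward the firing-time
    difference, so over repeated pairings w converges to t_post - t_pre.
    """
    for t_pre, t_post in spike_pairs:
        w += lr * ((t_post - t_pre) - w)
    return w

# Repeated pairings with a 3 ms post-minus-pre difference drive w toward 3.
pairs = [(10.0, 13.0)] * 200
print(round(learn(pairs), 4))  # 3.0
```

Only locally available quantities (the two firing times and the current weight) enter the update, matching the locality requirement stressed in the abstract.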