Results 1 - 6 of 6
Networks of Spiking Neurons: The Third Generation of Neural Network Models
Neural Networks, 1997
Cited by 138 (12 self)
Abstract:
The computational power of formal models for networks of spiking neurons is compared with that of other neural network models based on McCulloch-Pitts neurons (i.e., threshold gates) and sigmoidal gates, respectively. In particular, it is shown that networks of spiking neurons are computationally more powerful than these other neural network models. A concrete, biologically relevant function is exhibited which can be computed by a single spiking neuron (for biologically reasonable values of its parameters), but which requires hundreds of hidden units on a sigmoidal neural net. This article does not assume prior knowledge about spiking neurons, and it contains an extensive list of references to the currently available literature on computations in networks of spiking neurons and relevant results from neurobiology. 1 Definitions and Motivations If one classifies neural network models according to their computational units, one can distinguish three different generations. The first generation i...
Lower Bounds for the Computational Power of Networks of Spiking Neurons
Neural Computation, 1995
Cited by 53 (11 self)
Abstract:
We investigate the computational power of a formal model for networks of spiking neurons. It is shown that simple operations on phase differences between spike trains provide a very powerful computational tool that can in principle be used to carry out highly complex computations on a small network of spiking neurons. We construct networks of spiking neurons that simulate arbitrary threshold circuits, Turing machines, and a certain type of random access machine with real-valued inputs. We also show that relatively weak basic assumptions about the response and threshold functions of the spiking neurons are sufficient in order to employ them for such computations. 1 Introduction and Basic Definitions There exists substantial evidence that timing phenomena such as temporal differences between spikes and frequencies of oscillating subsystems are integral parts of various information-processing mechanisms in biological neural systems (for a survey and references see e.g. Kandel et al., ...
On the computational complexity of networks of spiking neurons
Advances in Neural Information Processing Systems, 1995
Cited by 18 (7 self)
Abstract:
We investigate the computational power of a formal model for networks of spiking neurons. It is shown that simple operations on phase differences between spike trains provide a very powerful computational tool that can in principle be used to carry out highly complex computations on a small network of spiking neurons. We construct networks of spiking neurons that simulate arbitrary threshold circuits, Turing machines, and a certain type of random access machine with real-valued inputs. We also show that relatively weak basic assumptions about the response and threshold functions of the spiking neurons are sufficient in order to employ them for such computations. Furthermore, we prove upper bounds for the computational power of networks of spiking neurons with arbitrary piecewise-linear response and threshold functions, and show that with regard to real-time simulations they are computationally equivalent to a certain type of random access machine and to recurrent analog neural nets with piecewise-linear activation functions. In addition, we give corresponding results for networks of spiking neurons with limited timing precision, and we prove upper and lower bounds for the VC-dimension and pseudo-dimension of networks of spiking neurons.
On the Relevance of the Shape of Postsynaptic Potentials for the Computational Power of Spiking Neurons
Proc. of the International Conference on Artificial Neural Networks (ICANN), 1995
Cited by 7 (6 self)
Abstract:
The firing of a neuron in a biological neural system causes in certain other neurons excitatory postsynaptic potential changes (EPSPs) that are not "rectangular", but have the form of a smooth hill. We prove in this article, for a formal model of a network of spiking neurons, that the rising and declining segments of these EPSPs are in fact essential for the computational power of the model. 1 Introduction Apparently all computations in biological neural systems are realized through sequences of firings of neurons as a result of incoming postsynaptic potentials; see e.g. (Kandel et al., 1991). Each firing of a neuron in a biological neural system causes excitatory or inhibitory postsynaptic potentials (EPSPs and IPSPs, respectively) in those other neurons to which it is connected by synapses. A neuron fires if the sum of its incoming postsynaptic potentials becomes larger than its current threshold (which depends on the time of its last previous firing). Recently one has also ...
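The firing rule described in this abstract can be sketched in a few lines. The following is a minimal illustrative simulation, not the paper's formal model: the EPSP is a hypothetical smooth difference-of-exponentials "hill" (rather than a rectangle), and the threshold is an assumed function that is infinite during a refractory period after a spike and then relaxes toward a base value. All parameter values are made up for illustration.

```python
import math

def epsp(t, w, rise=1.0, decay=4.0):
    """Difference-of-exponentials EPSP: a smooth 'hill', not a rectangle.
    t is the time elapsed since the presynaptic spike arrived."""
    if t < 0:
        return 0.0
    return w * (math.exp(-t / decay) - math.exp(-t / rise))

def threshold(dt_last, base=0.3, refractory=1.0):
    """Threshold depends on the time since the last firing: infinite during
    the refractory period, then relaxing toward `base`."""
    if dt_last < refractory:
        return float("inf")
    return base + 1.0 / (dt_last - refractory + 1.0)

def simulate(input_spikes, t_max=5.0, dt=0.01):
    """input_spikes: list of (arrival_time, synaptic_weight) pairs.
    The neuron fires whenever the summed postsynaptic potentials exceed
    its current threshold. Returns the firing times."""
    fires = []
    last_fire = -1e9  # effectively 'never fired'
    for k in range(int(t_max / dt)):
        t = k * dt
        potential = sum(epsp(t - s, w) for s, w in input_spikes)
        if potential > threshold(t - last_fire):
            fires.append(round(t, 2))
            last_fire = t
    return fires

# Three excitatory inputs arriving close together push the potential
# over the resting threshold shortly after the second spike arrives.
print(simulate([(0.0, 1.0), (0.5, 1.0), (1.0, 1.0)]))
```

The non-rectangular shape matters here exactly as the abstract argues: with a flat ("rectangular") EPSP the summed potential would cross the threshold only at arrival instants, whereas the smooth rise lets the crossing time itself carry analog information.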
The Computational Power of Spiking Neurons Depends on the Shape of the Postsynaptic Potentials
1996
Cited by 3 (0 self)
Abstract:
Recently one has started to investigate the computational power of spiking neurons (also called "integrate-and-fire neurons"). These neuron models are substantially more realistic from the biological point of view than the ones traditionally employed in artificial neural nets. It has turned out that the computational power of networks of spiking neurons is quite large. In particular, they have the ability to communicate and manipulate analog variables in spatiotemporal coding, i.e., encoded in the time points when specific neurons "fire" (and thus send a "spike" to other neurons). These results have motivated the question of which details of the firing mechanism of spiking neurons are essential for their computational power, and which details are "accidental" aspects of their realization in biological "wetware". Obviously this question becomes important if one wants to capture some of the advantages of computing and learning with spatiotemporal c...
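The idea of encoding analog variables in firing times, mentioned in the abstract above, can be illustrated with a toy convention (an assumption for illustration, not the construction used in these papers): a value x in [0, 1] is mapped to a spike time within a fixed window, with larger values firing earlier, and is recovered exactly from that time.

```python
def encode(x, window=10.0):
    """Temporal coding sketch: map x in [0, 1] to a spike time in
    [0, window]; larger values fire earlier."""
    assert 0.0 <= x <= 1.0
    return (1.0 - x) * window

def decode(t_spike, window=10.0):
    """Recover the analog value from the observed spike time."""
    return 1.0 - t_spike / window

# The spike time carries the full analog value; the round trip is exact.
print(decode(encode(0.37)))
```

Under such a coding, operations on analog values become operations on spike-time differences, which is precisely the computational resource the phase-difference results above exploit.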
Computing the Maximum Bichromatic Discrepancy, with Applications to Computer Graphics and Machine Learning
1995
Abstract:
Computing the maximum bichromatic discrepancy is an interesting theoretical problem with important applications in computational learning theory, computational geometry, and computer graphics. In this paper we give algorithms to compute the maximum bichromatic discrepancy for simple geometric ranges, including rectangles and halfspaces. In addition, we give extensions to other discrepancy problems. 1 Introduction The main theme of this paper is to present efficient algorithms for computing the maximum bichromatic discrepancy for axis-oriented rectangles. This problem arises naturally in different areas of computer science, such as computational learning theory, computational geometry, and computer graphics ([Ma], [DG]), and has applications in all these areas. In computational learning theory, the problem of agnostic PAC-learning with simple geometric hypotheses can be reduced to the problem of computing the maximum bichromatic discrepancy for simple geometric ra...
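For the axis-oriented rectangle case described above, there is an obvious brute-force baseline: try every rectangle whose sides pass through input coordinates and maximize the imbalance between red and blue points inside. The sketch below is illustrative only (the paper's contribution is algorithms that improve substantially on this), and the exact discrepancy definition used is an assumption here: the maximum over rectangles of |#red inside - #blue inside|.

```python
def bichromatic_discrepancy(red, blue):
    """Naive baseline, roughly O(n^5): enumerate candidate axis-aligned
    rectangles with sides through input coordinates and maximize the
    absolute red/blue count difference inside."""
    xs = sorted({p[0] for p in red + blue})
    ys = sorted({p[1] for p in red + blue})
    best = 0
    for i, x1 in enumerate(xs):
        for x2 in xs[i:]:
            for j, y1 in enumerate(ys):
                for y2 in ys[j:]:
                    def inside(p):
                        return x1 <= p[0] <= x2 and y1 <= p[1] <= y2
                    d = abs(sum(map(inside, red)) - sum(map(inside, blue)))
                    best = max(best, d)
    return best

# The rectangle covering (0,0)-(1,1) contains both red points and no blue
# point, so the discrepancy is 2.
print(bichromatic_discrepancy([(0, 0), (1, 1)], [(2, 2)]))
```

Restricting candidate sides to input coordinates loses nothing, since sliding a rectangle side between two point coordinates cannot change which points lie inside.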