Results 1–10 of 98
Fast Sigmoidal Networks via Spiking Neurons
Neural Computation, 1997
Abstract

Cited by 52 (8 self)
We show that networks of relatively realistic mathematical models for biological neurons can in principle simulate arbitrary feedforward sigmoidal neural nets in a way which has previously not been considered. This new approach is based on temporal coding by single spikes (respectively by the timing of synchronous firing in pools of neurons), rather than on the traditional interpretation of analog variables in terms of firing rates. The resulting new simulation is substantially faster and hence more consistent with experimental results about the maximal speed of information processing in cortical neural systems. As a consequence we can show that networks of noisy spiking neurons are "universal approximators" in the sense that they can approximate with regard to temporal coding any given continuous function of several variables. This result holds for a fairly large class of schemes for coding analog variables by firing times of spiking neurons. Our new proposal for the possible organiza...
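The core idea above can be illustrated with a small, hypothetical sketch: an analog value in [0, 1] is represented by how early a neuron fires within a coding window, rather than by its firing rate. The window length and the linear coding scheme below are illustrative assumptions, not the paper's exact construction.

```python
T_WINDOW = 10.0  # coding window in ms (an assumed value)

def encode(x: float) -> float:
    """Temporal coding: larger analog values fire earlier in the window."""
    return T_WINDOW * (1.0 - x)

def decode(t: float) -> float:
    """Recover the analog value from the firing time."""
    return 1.0 - t / T_WINDOW

values = [0.0, 0.25, 0.5, 1.0]
times = [encode(x) for x in values]      # [10.0, 7.5, 5.0, 0.0]
recovered = [decode(t) for t in times]   # round-trips back to values
print(times, recovered)
```

Because a single spike carries the value, a downstream neuron can react after one interspike interval instead of integrating a rate over many spikes, which is what makes the simulation fast.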
Spatial and Temporal Pattern Analysis via Spiking Neurons
Network: Computation in Neural Systems, 1998
Abstract

Cited by 46 (0 self)
Spiking neurons, receiving temporally encoded inputs, can compute radial basis functions (RBFs) by storing the relevant information in their delays. In this paper we show how these delays can be learned using exclusively locally available information (basically the time difference between the pre- and postsynaptic spikes). Our approach gives rise to a biologically plausible algorithm for finding clusters in a high-dimensional input space with networks of spiking neurons, even if the environment is changing dynamically. Furthermore, we show that our learning mechanism makes it possible for such RBF neurons to perform some kind of feature extraction, where they recognize that only certain input coordinates carry relevant information. Finally, we demonstrate that this model allows the recognition of temporal sequences even if they are distorted in various ways. 1. Introduction. Radial basis functions (RBFs) have turned out to be among the most powerful artificial neural network types, e....
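The delay-as-RBF-centre idea can be sketched as coincidence detection: learned delays align the input spikes, and the neuron responds strongly only when the delayed spikes arrive together. The Gaussian response function below is an illustrative assumption, not the paper's specific neuron model.

```python
import math

def rbf_response(spike_times, delays, width=1.0):
    """Response is maximal (1.0) when t_i + d_i coincide for all inputs,
    and falls off with the spread of the delayed arrival times."""
    arrivals = [t + d for t, d in zip(spike_times, delays)]
    mean = sum(arrivals) / len(arrivals)
    spread = sum((a - mean) ** 2 for a in arrivals) / len(arrivals)
    return math.exp(-spread / (2 * width ** 2))

delays = [3.0, 2.0, 1.0]          # the stored "centre" of the RBF
centre_input = [0.0, 1.0, 2.0]    # spike pattern the delays align exactly
other_input = [2.0, 1.0, 0.0]     # a different pattern

print(rbf_response(centre_input, delays))  # 1.0 (perfect coincidence)
print(rbf_response(other_input, delays))   # strictly smaller
```

The local learning rule described in the abstract would then nudge each delay by an amount depending only on the pre/post spike-time difference, pulling the centre toward frequently seen input patterns.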
Paradigms for Computing with Spiking Neurons
1999
Abstract

Cited by 42 (1 self)
this technical difficulty by considering for example in a simplified setting only correlation variables
Weakly pulse-coupled oscillators, FM interactions, synchronization, and oscillatory associative memory
IEEE Trans. Neural Networks, 1999
Abstract

Cited by 22 (3 self)
Abstract—We study pulse-coupled neural networks that satisfy only two assumptions: each isolated neuron fires periodically, and the neurons are weakly connected. Each such network can be transformed by a piecewise continuous change of variables into a phase model, whose synchronization behavior and oscillatory associative properties are easier to analyze and understand. Using the phase model, we can predict whether a given pulse-coupled network has oscillatory associative memory, or what minimal adjustments should be made so that it can acquire memory. In the search for such minimal adjustments we obtain a large class of simple pulse-coupled neural networks that can memorize and reproduce synchronized temporal patterns the same way a Hopfield network does with static patterns. The learning occurs via modification of synaptic weights and/or synaptic transmission delays. Index Terms—Canonical models, Class 1 neural excitability, integrate-and-fire neurons, multiplexing, synfire chain, transmission delay.
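A minimal, hypothetical illustration of the phase-model viewpoint: once the network is reduced to phases, synchronization can be read off directly. Here a simple Kuramoto-style sine coupling stands in for the reduced pulse coupling; the parameters are illustrative only.

```python
import math

def simulate(eps=0.1, dt=0.01, steps=20000):
    """Two weakly coupled phase oscillators with identical frequencies."""
    theta1, theta2 = 0.0, 2.0   # start out of phase
    omega = 1.0                 # natural frequency
    for _ in range(steps):
        theta1 += dt * (omega + eps * math.sin(theta2 - theta1))
        theta2 += dt * (omega + eps * math.sin(theta1 - theta2))
    # phase difference wrapped to (-pi, pi]
    return (theta2 - theta1 + math.pi) % (2 * math.pi) - math.pi

print(abs(simulate()))  # near 0: the oscillators have synchronized
```

The phase difference obeys d(Δθ)/dt = −2ε sin(Δθ), so for weak coupling ε > 0 the in-phase state Δθ = 0 is the stable attractor, which is the kind of conclusion the phase reduction makes easy to obtain.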
Self-Organization of Spiking Neurons Using Action Potential Timing
1998
Abstract

Cited by 18 (0 self)
We propose a mechanism for unsupervised learning in networks of spiking neurons which is based on the timing of single firing events. Our results show that a topology-preserving behaviour quite similar to that of Kohonen's self-organizing map can be achieved using temporal coding. In contrast to previous approaches, which use rate coding, the winner among competing neurons can be determined fast and locally. Our model is a further step towards a more realistic description of unsupervised learning in biological neural systems. Furthermore, it may provide a basis for fast implementations in pulsed VLSI. Keywords: self-organizing map, spiking neurons, temporal coding, unsupervised learning. I. Introduction. In the area of modelling information processing in biological neural systems, there is an ongoing debate about which essentials have to be taken into account (see e.g. [1], [2], [3], [4]). Discrete models, such as threshold gates or McCulloch-Pitts neurons, are undoubtedly very simplis...
Dynamic Stochastic Synapses as Computational Units
1999
Abstract

Cited by 17 (7 self)
In most neural network models, synapses are treated as static weights that change only on the slow time scales of learning. It is well known, however, that synapses are highly dynamic and show use-dependent plasticity over a wide range of time scales. Moreover, synaptic transmission is an inherently stochastic process: a spike arriving at a presynaptic terminal triggers release of a vesicle of neurotransmitter from a release site with a probability that can be much less than one. We consider
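The stochastic, use-dependent synapse described above can be sketched as follows. The specific dynamics here (multiplicative depression after each release, exponential recovery between spikes) are an illustrative assumption, not the authors' exact model.

```python
import random

class StochasticSynapse:
    def __init__(self, p_release=0.8, depression=0.5, recovery=0.1):
        self.p = p_release            # current release probability
        self.p_max = p_release        # resting release probability
        self.depression = depression  # multiplicative drop after a release
        self.recovery = recovery      # fraction recovered per rest step

    def spike(self) -> bool:
        """A presynaptic spike releases a vesicle with probability p."""
        released = random.random() < self.p
        if released:
            self.p *= self.depression  # use-dependent depression
        return released

    def rest(self):
        """Between spikes, p relaxes back toward its resting value."""
        self.p += self.recovery * (self.p_max - self.p)

random.seed(0)
syn = StochasticSynapse()
train = []
for _ in range(10):            # a burst of presynaptic spikes
    train.append(syn.spike())
    syn.rest()
print(train)  # a mix of releases and failures
```

Because the release probability itself depends on the recent spike history, the synapse acts as a small computational unit that filters temporal patterns, which is the paper's point of departure.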
On the Complexity of Learning for a Spiking Neuron
1997
Abstract

Cited by 16 (7 self)
Wolfgang Maass and Michael Schmitt. Abstract: Spiking neurons are models for the computational units in biological neural systems where information is considered to be encoded mainly in the temporal patterns of their activity. They provide a way of analyzing neural computation that is not captured by traditional neuron models such as sigmoidal and threshold gates (or "Perceptrons"). We introduce a simple model of a spiking neuron that, in addition to the weights that model the plasticity of synaptic strength, also has variable transmission delays between neurons as programmable parameters. For coding of input and output values two modes are taken into account: binary coding for the Boolean domain and analog coding for the real-valued domain. We investigate the complexity of learning for a single spiking neuron within the framework of PAC-learnability. With regard to sample complexity, we prove that the VC-dimension is Θ(n log n) and, hence, strictly larger than that of a thresho...
On Computing Boolean Functions by a Spiking Neuron
Annals of Mathematics and Artificial Intelligence, 1998
Abstract

Cited by 16 (2 self)
Computations by spiking neurons are performed using the timing of action potentials. We investigate the computational power of a simple model for such a spiking neuron in the Boolean domain by comparing it with traditional neuron models such as threshold gates (or McCulloch-Pitts neurons) and sigma-pi units (or polynomial threshold gates). In particular, we estimate the number of gates required to simulate a spiking neuron by a disjunction of threshold gates and we establish tight bounds for this threshold number. Furthermore, we analyze the degree of the polynomials that a sigma-pi unit must use for the simulation of a spiking neuron. We show that this degree cannot be bounded by any fixed value. Our results give evidence that the use of continuous time as a computational resource endows single-cell models with substantially larger computational capabilities. 1. Introduction. Biological neurons communicate by sending spikes among themselves. A spike is a discrete event in continuous ti...
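A toy, assumed formalisation of such a spiking neuron in the Boolean domain: each active input contributes a linearly rising and falling pulse through its own delay, and the neuron outputs 1 iff the summed potential ever crosses threshold. All parameters below are illustrative, not the paper's definitions.

```python
def pulse(t, arrival, height, width=2.0):
    """Triangular pulse: rises linearly after arrival, then falls."""
    dt = t - arrival
    if 0.0 <= dt <= width:
        return height * (1.0 - abs(dt - width / 2) / (width / 2))
    return 0.0

def spiking_neuron(inputs, weights, delays, threshold=1.5):
    """inputs: Boolean vector; input i that is 1 fires at time 0 and its
    pulse arrives after delays[i]. Output 1 iff the potential crosses
    threshold at any sampled time."""
    for step in range(100):              # scan 0..10 in steps of 0.1
        t = step / 10.0
        v = sum(pulse(t, d, w)
                for x, w, d in zip(inputs, weights, delays) if x)
        if v >= threshold:
            return 1
    return 0

# With equal delays the two pulses coincide and the neuron computes AND:
weights, delays = [1.0, 1.0], [1.0, 1.0]
print(spiking_neuron([1, 1], weights, delays))  # 1
print(spiking_neuron([1, 0], weights, delays))  # 0
```

Shifting the delays apart de-synchronizes the pulses and changes the computed function, which hints at why timing gives a single cell more expressive power than a fixed threshold gate.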
On computation with pulses
Information and Computation, 1999
Abstract

Cited by 14 (0 self)
We explore the computational power of formal models for computation with pulses. Such models are motivated by realistic models for biological neurons, and by related new types of VLSI ("pulse stream VLSI"). In preceding work it was shown that the computational power of formal models for computation with pulses is quite high if the pulses arriving at a computational unit have an approximately linearly rising or linearly decreasing initial segment. This property is satisfied by common models for biological neurons. On the other hand, several implementations of pulse stream VLSI employ pulses that are approximately piecewise constant (i.e. step functions). In this article we investigate the relevance of the shape of pulses in formal models for computation with pulses. It turns out that the computational power drops significantly if one replaces pulses with linearly rising or decreasing initial segments by piecewise constant pulses. We provide an exact characterization of the latter model in terms of a weak version of a random access machine (RAM). We also compare the language recognition capability of a recurrent version of this model with that of deterministic finite automata and Turing machines.
On the Relevance of Time in Neural Computation and Learning
In Proceedings of the 8th International Workshop on Algorithmic Learning Theory, ALT’97, 1997
Abstract

Cited by 14 (0 self)
We discuss models for computation in biological neural systems that are based on the current state of knowledge in neurophysiology. Differences and similarities to traditional neural network models are highlighted. It turns out that many important questions regarding computation and learning in biological neural systems cannot be adequately addressed in traditional neural network models. In particular, the role of time is quite different in biologically more realistic models, and many fundamental questions regarding computation and learning have to be rethought for this context. Simultaneously, a new generation of VLSI chips is emerging ("pulsed VLSI") where new ideas about computing and learning with temporal coding can be tested. Articles with details and further pointers to the literature can be found at http://www.cis.tugraz.ac.at/igi/maass/ . 1. Introduction. An analysis of the role of a gate g in a computation on a familiar computational model, such as a boolean circuit or an artifi...