Results 1 – 6 of 6
Networks of Spiking Neurons: The Third Generation of Neural Network Models
, 1996
Abstract

Cited by 192 (14 self)
The computational power of formal models for networks of spiking neurons is compared with that of other neural network models based on McCulloch-Pitts neurons (i.e., threshold gates) or sigmoidal gates. In particular, it is shown that networks of spiking neurons are computationally more powerful than these other neural network models. A concrete, biologically relevant function is exhibited that can be computed by a single spiking neuron (for biologically reasonable values of its parameters) but requires hundreds of hidden units in a sigmoidal neural net. This article does not assume prior knowledge about spiking neurons, and it contains an extensive list of references to the currently available literature on computations in networks of spiking neurons and relevant results from neurobiology.
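The spiking-neuron models analyzed in the paper exploit spike *timing* as a computational resource. A minimal leaky integrate-and-fire sketch can illustrate the idea; this is an informal toy, not the paper's formal model, and all constants (weight, time constant, threshold) are arbitrary choices for demonstration:

```python
# Toy leaky integrate-and-fire (LIF) neuron -- an illustrative sketch only,
# not the formal spiking-neuron model analyzed in the paper.

def lif_spike_times(input_spikes, weight=0.6, tau=10.0, threshold=1.0,
                    dt=0.1, t_max=50.0):
    """Return output spike times for a neuron receiving weighted input spikes."""
    v = 0.0                       # membrane potential, resting at 0
    out = []
    spike_steps = set(round(t / dt) for t in input_spikes)
    for step in range(int(t_max / dt)):
        v += dt * (-v / tau)      # leak: exponential decay toward rest
        if step in spike_steps:
            v += weight           # instantaneous synaptic kick
        if v >= threshold:
            out.append(step * dt)
            v = 0.0               # reset after firing
    return out

# Two closely spaced input spikes push the potential over threshold,
# while the same spikes far apart do not -- timing carries information.
print(lif_spike_times([5.0, 6.0]))    # coincident inputs -> the neuron fires
print(lif_spike_times([5.0, 30.0]))   # spread-out inputs -> it stays silent
```

The coincidence-detection behavior shown here is one reason a single spiking neuron can compute functions that are expensive for summing-and-squashing units.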
On the Complexity of Computing and Learning with Multiplicative Neural Networks
 NEURAL COMPUTATION
Abstract

Cited by 38 (3 self)
In a great variety of neuron models, neural inputs are combined by summation. We introduce the concept of multiplicative neural networks, which contain units that multiply their inputs instead of summing them and thus allow inputs to interact nonlinearly. The class of multiplicative neural networks comprises such widely known and well-studied network types as higher-order networks and product-unit networks. We investigate the complexity of computing and learning for multiplicative neural networks. In particular, we derive upper and lower bounds on the Vapnik-Chervonenkis (VC) dimension and the pseudo-dimension for various types of networks with multiplicative units. As the most general case, we consider feedforward networks consisting of product and sigmoidal units, showing that their pseudo-dimension is bounded from above by a polynomial with the same order of magnitude as the currently best known bound for purely sigmoidal networks. Moreover, we show that this bound holds even when the unit type, product or sigmoidal, may be learned. Crucial for these results are calculations of bounds on solution set components for new network classes. As for lower bounds, we construct product-unit networks of fixed depth with superlinear VC dimension. For sigmoidal networks of higher order we establish polynomial bounds that, in contrast to previous results, do not involve any restriction on the network order. We further consider various classes of higher-order units, also known as sigma-pi units, that are characterized by connectivity constraints. In terms of these we derive some asymptotically tight bounds.
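A product unit computes a weighted *product* of its inputs, x1**w1 * x2**w2 * ..., where a summing unit computes a weighted sum. A minimal sketch of the two unit types (the function names and parameter values are ours, chosen for illustration):

```python
# Illustrative comparison of a summing unit and a multiplicative
# ("product") unit; not code from the paper.

def summing_unit(x, w):
    """Standard weighted sum: w1*x1 + w2*x2 + ..."""
    return sum(wi * xi for wi, xi in zip(w, x))

def product_unit(x, w):
    """Product unit: x1**w1 * x2**w2 * ... -- inputs interact nonlinearly,
    and the weights appear as (possibly negative) exponents."""
    result = 1.0
    for wi, xi in zip(w, x):
        result *= xi ** wi
    return result

x = [2.0, 3.0]
print(summing_unit(x, [1.0, 1.0]))   # 5.0
print(product_unit(x, [1.0, 1.0]))   # 6.0 -- computes x1*x2 exactly
print(product_unit(x, [2.0, -1.0]))  # x1**2 / x2 -- ratios via negative weights
```

A single product unit thus represents multiplicative interactions (products, ratios, monomials) that a summing unit can only approximate through additional hidden layers.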
Circuits versus Trees in Algebraic Complexity
 In Proc. STACS 2000
, 2000
Abstract

Cited by 9 (8 self)
This survey is devoted to some aspects of the "P = NP?" problem over the real numbers and more general algebraic structures. We argue that, given a structure M, it is important to find out whether NP_M problems can be solved by polynomial-depth computation trees, and if so, whether these trees can be efficiently simulated by circuits. Point location, a problem of computational geometry, comes into play in the study of these questions for several structures of interest.

1 Introduction

In algebraic complexity one measures the complexity of an algorithm by the number of basic operations performed during a computation. The basic operations are usually arithmetic operations and comparisons, but sometimes transcendental functions are also allowed [21-23, 26]. Even when the set of basic operations has been fixed, the complexity of a problem depends on the particular model of computation considered. The two main categories of interest for this paper are circuits and trees. In section 2 and...
Lower Bounds on the Complexity of Approximating Continuous Functions by Sigmoidal Neural Networks
, 2000
"... We calculate lower bounds on the size of sigmoidal neural networks that approximate continuous functions. In particular, we show that for the approximation of polynomials the network size has to grow as\Omega\Gamma/357 k) ) where k is the degree of the polynomials. This bound is valid for any ..."
Abstract

Cited by 1 (1 self)
We calculate lower bounds on the size of sigmoidal neural networks that approximate continuous functions. In particular, we show that for the approximation of polynomials the network size has to grow as Ω(k^{1/4}), where k is the degree of the polynomials. This bound is valid for any input dimension, i.e., independently of the number of variables. The result is obtained by introducing a new method that employs upper bounds on the Vapnik-Chervonenkis dimension for proving lower bounds on the size of networks that approximate continuous functions.
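The proof pattern named in this abstract, turning a VC-dimension *upper* bound into a network-size *lower* bound, reduces to simple arithmetic. A sketch under placeholder assumptions (the constant c and exponent alpha below are ours, not the paper's values):

```python
# Sketch of the lower-bound argument: if VCdim(size-N nets) <= c * N**alpha,
# and approximating every degree-k polynomial forces the network class to
# shatter at least k points (VCdim >= k), then k <= c * N**alpha, so
# N >= (k / c) ** (1 / alpha).  c and alpha are illustrative placeholders.

def min_network_size(k, c=1.0, alpha=4.0):
    """Smallest network size N consistent with VCdim >= k under the
    assumed upper bound VCdim <= c * N**alpha."""
    return (k / c) ** (1.0 / alpha)

for k in (16, 256, 4096):
    # under these assumptions the required size grows like k**(1/4)
    print(k, min_network_size(k))
```

The point of the method is that any polynomial VC-dimension upper bound of this shape yields a root-of-the-degree lower bound on network size, independently of the number of input variables.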