Results 1–10 of 19
Computational Complexity Of Neural Networks: A Survey
, 1994
"... . We survey some of the central results in the complexity theory of discrete neural networks, with pointers to the literature. Our main emphasis is on the computational power of various acyclic and cyclic network models, but we also discuss briefly the complexity aspects of synthesizing networks fr ..."
Abstract

Cited by 26 (6 self)
We survey some of the central results in the complexity theory of discrete neural networks, with pointers to the literature. Our main emphasis is on the computational power of various acyclic and cyclic network models, but we also discuss briefly the complexity aspects of synthesizing networks from examples of their behavior. CR Classification: F.1.1 [Computation by Abstract Devices]: Models of Computation: neural networks, circuits; F.1.3 [Computation by Abstract Devices]: Complexity Classes: complexity hierarchies. Key words: neural networks, computational complexity, threshold circuits, associative memory. 1. Introduction. The once-again very active field of computation by "neural" networks has opened up a wealth of fascinating research topics in the computational complexity analysis of the models considered. While much of the general appeal of the field stems not so much from new computational possibilities as from the possibility of "learning", or synthesizing networks...
Neural Networks and Complexity Theory
 In Proc. 17th International Symposium on Mathematical Foundations of Computer Science
, 1992
"... . We survey some of the central results in the complexity theory of discrete neural networks, with pointers to the literature. 1 Introduction The recently revived field of computation by "neural" networks provides the complexity theorist with a wealth of fascinating research topics. Whi ..."
Abstract

Cited by 19 (4 self)
We survey some of the central results in the complexity theory of discrete neural networks, with pointers to the literature. 1 Introduction. The recently revived field of computation by "neural" networks provides the complexity theorist with a wealth of fascinating research topics. While much of the general appeal of the field stems not so much from new computational possibilities as from the possibility of "learning", or synthesizing networks directly from examples of their desired input-output behavior, it is nevertheless important to pay attention also to the complexity issues: firstly, what kinds of functions are computable by networks of a given type and size, and secondly, what is the complexity of the synthesis problems considered? In fact, inattention to these issues was a significant factor in the demise of the first stage of neural networks research in the late 1960s, under the criticism of Minsky and Papert [51]. The intent of this paper is to survey some of the centra...
Robustness in regulatory networks: a multidisciplinary approach
 Acta Biotheoretica
, 2008
"... Abstract. We give in this paper indications about the dynamical impact (as phenotypic changes) coming from the main sources of perturbation in biological regulatory networks. First, we define the boundary of the interaction graph expressing the regulations between the main elements of the network ..."
Abstract

Cited by 15 (5 self)
Abstract. In this paper we indicate the dynamical impact (as phenotypic changes) of the main sources of perturbation in biological regulatory networks. First, we define the boundary of the interaction graph expressing the regulations between the main elements of the network (genes, proteins, metabolites, ...). Then, we ask which changes in the state values on the boundary can cause changes of state in the core of the system (robustness to boundary conditions). Next, we analyse the role of the updating mode (sequential, block-sequential or parallel) on the asymptotics of the network, essentially on the occurrence of limit cycles (robustness to updating methods). Finally, we show the influence of some topological changes (e.g. suppression or addition of interactions) on the dynamical behaviour of the system (robustness to topology perturbations).
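The role of the updating mode can be illustrated with a minimal two-gene Boolean network (a hypothetical example, not taken from the paper): the same local rules produce a limit cycle under parallel updating but a fixed point under sequential updating.

```python
# Hypothetical two-gene Boolean network (not from the paper) illustrating
# robustness to updating methods: identical local rules yield a limit cycle
# under parallel updating but a fixed point under sequential updating.

def f(x):
    # local rules: each gene simply copies the current state of the other
    return (x[1], x[0])

def parallel_step(x):
    # both genes read the old state and switch simultaneously
    return f(x)

def sequential_step(x):
    # genes update one at a time, each seeing the other's fresh value
    x = list(x)
    x[0] = f(tuple(x))[0]
    x[1] = f(tuple(x))[1]
    return tuple(x)

def trajectory(step, x, n=6):
    traj = [x]
    for _ in range(n):
        x = step(x)
        traj.append(x)
    return traj

par = trajectory(parallel_step, (0, 1))
seq = trajectory(sequential_step, (0, 1))
print("parallel:  ", par)   # alternates (0,1) <-> (1,0): a limit cycle of period 2
print("sequential:", seq)   # settles on the fixed point (1,1) after one sweep
```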
An Overview Of The Computational Power Of Recurrent Neural Networks
 Proceedings of the 9th Finnish AI Conference STeP 2000 (Millennium of AI), Espoo, Finland (Vol. 3: "AI of Tomorrow": Symposium on Theory), Finnish AI Society
, 2000
"... INTRODUCTION The two main streams of neural networks research consider neural networks either as a powerful family of nonlinear statistical models, to be used in for example pattern recognition applications [6], or as formal models to help develop a computational understanding of the brain [10]. His ..."
Abstract

Cited by 12 (3 self)
INTRODUCTION. The two main streams of neural networks research consider neural networks either as a powerful family of nonlinear statistical models, to be used in for example pattern recognition applications [6], or as formal models to help develop a computational understanding of the brain [10]. Historically, the brain theory interest was primary [32], but with the advances in computer technology, the application potential of the statistical modeling techniques has shifted the balance. The study of neural networks as general computational devices does not strictly follow this division of interests: rather, it provides a general framework outlining the limitations and possibilities affecting both research domains. The prime historic example here is obviously Minsky's and Papert's 1969 study of the computational limitations of single-layer perceptrons [34], which was a major influence in turning interest away from neural network learning to symbolic AI techniques for more...
The Computational Power of Discrete Hopfield Nets with Hidden Units
 Neural Computation
, 1996
"... We prove that polynomial size discrete Hopfield networks with hidden units compute exactly the class of Boolean functions PSPACE/poly, i.e., the same functions as are computed by polynomial spacebounded nonuniform Turing machines. As a corollary to the construction, we observe also that networks wi ..."
Abstract

Cited by 12 (6 self)
We prove that polynomial-size discrete Hopfield networks with hidden units compute exactly the class of Boolean functions PSPACE/poly, i.e., the same functions as are computed by polynomial-space-bounded nonuniform Turing machines. As a corollary to the construction, we observe also that networks with polynomially bounded interconnection weights compute exactly the class of functions P/poly, i.e., the class computed by polynomial-time-bounded nonuniform Turing machines.
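A minimal sketch of the network model the result refers to: units take states in {-1, +1}, the weight matrix is symmetric, and every unit updates synchronously by thresholding its weighted input. The weights below are an arbitrary illustrative choice, not the PSPACE/poly construction from the paper; they merely show the dynamics ending, as symmetric synchronous nets always do, in a fixed point or a cycle of length two.

```python
import numpy as np

# Toy discrete Hopfield net with symmetric weights (illustrative values only,
# not the construction from the paper). All units update synchronously.

W = np.array([[ 0.,  1., -1.],
              [ 1.,  0.,  1.],
              [-1.,  1.,  0.]])

def step(x):
    s = W @ x
    # sign threshold; a unit keeps its old state on a net input of exactly 0
    return np.where(s > 0, 1.0, np.where(s < 0, -1.0, x))

x = np.array([-1., 1., -1.])
seen = [tuple(x)]
for _ in range(10):
    x = step(x)
    seen.append(tuple(x))

# symmetric synchronous dynamics always ends in a fixed point or a 2-cycle;
# this initial state happens to land on a cycle of length two
print(seen[-2:])
```

The paper's point is that, despite this simple local rule, polynomial-size nets of this kind with hidden units capture all of PSPACE/poly.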
On the Computational Power of Discrete Hopfield Nets
 In: Proc. 20th International Colloquium on Automata, Languages, and Programming
, 1993
"... . We prove that polynomial size discrete synchronous Hopfield networks with hidden units compute exactly the class of Boolean functions PSPACE/poly, i.e., the same functions as are computed by polynomial spacebounded nonuniform Turing machines. As a corollary to the construction, we observe also th ..."
Abstract

Cited by 6 (4 self)
We prove that polynomial-size discrete synchronous Hopfield networks with hidden units compute exactly the class of Boolean functions PSPACE/poly, i.e., the same functions as are computed by polynomial-space-bounded nonuniform Turing machines. As a corollary to the construction, we observe also that networks with polynomially bounded interconnection weights compute exactly the class of functions P/poly. 1 Background. Recurrent, or cyclic, neural networks are an intriguing model of massively parallel computation. In the recent surge of research in neural computation, such networks have been considered mostly from the point of view of two types of applications: pattern classification and associative memory (e.g. [16, 18, 21, 24]), and combinatorial optimization (e.g. [1, 7, 20]). Nevertheless, recurrent networks are also capable of more general types of computation, and the issues of what exactly such networks can compute, and how they should be programmed, are becoming increasingly topica...
Absence of Cycles in Symmetric Neural Networks
 Advances in Neural Information Processing Systems (NIPS) 8
, 1995
"... For a given recurrent neural network, a discretetime model may have asymptotic dynamics different from the one of a related continuoustime model. In this paper, we consider a discretetime model that discretizes the continuoustime leaky integrator model and study its parallel and sequential dynam ..."
Abstract

Cited by 3 (1 self)
For a given recurrent neural network, a discrete-time model may have asymptotic dynamics different from those of a related continuous-time model. In this paper, we consider a discrete-time model that discretizes the continuous-time leaky integrator model and study its parallel and sequential dynamics for symmetric networks. We provide sufficient (and in many cases necessary) conditions for the discretized model to have the same cycle-free dynamics as the corresponding continuous-time model in symmetric networks. 1 INTRODUCTION. For an n-neuron recurrent network, a much-studied and widely used continuous-time (CT) model is the leaky integrator model (Hertz et al., 1991; Hopfield, 1984), given by a system of nonlinear differential equations:

\tau_i \frac{dx_i}{dt} = -x_i + \sigma_i\Big( \sum_{j=1}^{n} w_{ij} x_j + I_i \Big), \quad t \ge 0, \quad i = 1, \ldots, n, \qquad (1)

and a related discrete-time (DT) version is the sigmoidal model (Hopfield, 1982; Marcus & Westervelt, 1989), specified by a system of nonlinear difference e...
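Under made-up toy parameters, the two models being compared can be sketched side by side: the leaky integrator (1) integrated by forward Euler, and the discrete-time sigmoidal iteration, with tanh standing in for the sigmoids σ_i.

```python
import numpy as np

# Sketch, under assumed parameters, of the two models discussed above:
# the continuous-time leaky integrator (1), integrated here by forward
# Euler, and the discrete-time sigmoidal iteration x(t+1) = sigma(W x(t) + I).
# tanh stands in for sigma_i; W is symmetric, as in the paper's setting.

n = 3
rng = np.random.default_rng(0)
A = rng.normal(size=(n, n))
W = (A + A.T) / 2                  # symmetric weights
I = np.zeros(n)
tau = np.ones(n)

def ct_euler(x, dt=0.01, steps=5000):
    # forward-Euler integration of tau_i dx_i/dt = -x_i + tanh(sum_j w_ij x_j + I_i)
    for _ in range(steps):
        x = x + dt * (-x + np.tanh(W @ x + I)) / tau
    return x

def dt_step(x):
    # discrete-time sigmoidal model; under parallel updates this may end in
    # a period-2 cycle even when the CT model is cycle-free
    return np.tanh(W @ x + I)

x0 = rng.normal(size=n)
xc = ct_euler(x0.copy())
xs = x0.copy()
for _ in range(200):
    xs = dt_step(xs)
print("CT (Euler) state:", xc)
print("DT iterate:      ", xs)
```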
Parametric Analysis of Weighted Order Statistic Filters
 Proceedings IEEE International Symposium on Circuits and Systems
, 1994
"... AbstractIn this letter, we shall study the convergence properties of weighted order statistics filters. Based on a set of parameters, weighted order statistics filters are divided into five categories making their convergence properties easily understood. It will be shown that any symmetric weighte ..."
Abstract

Cited by 1 (0 self)
Abstract. In this letter, we study the convergence properties of weighted order statistics filters. Based on a set of parameters, weighted order statistics filters are divided into five categories, making their convergence properties easily understood. It is shown that any symmetric weighted order statistics filter will make the input sequence converge to a root or oscillate in a cycle of period 2. This result is significant since a restriction imposed by earlier research is eliminated, making the result applicable to the whole class of symmetric weighted order statistics filters. A condition to guarantee convergence of symmetric weighted order statistics filters is also derived.
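As an illustration (the signal and integer weights below are made up, and the weighted median is one common instance of a weighted order statistic), repeatedly applying a symmetric weighted median filter to a binary sequence quickly reaches a root or a period-2 cycle, as the result predicts:

```python
# Made-up illustration: a symmetric weighted median filter, one common
# weighted order statistic filter, applied repeatedly to a binary signal.
# Per the result above, the iteration must end in a root (a signal the
# filter leaves unchanged) or a cycle of period 2.

def weighted_median(window, weights):
    # replicate each sample according to its integer weight, then take the
    # ordinary median of the expanded multiset
    expanded = sorted(v for v, w in zip(window, weights) for _ in range(w))
    return expanded[len(expanded) // 2]

def wos_filter(x, weights):
    k = len(weights) // 2
    padded = [x[0]] * k + list(x) + [x[-1]] * k   # replicate the endpoints
    return [weighted_median(padded[i:i + len(weights)], weights)
            for i in range(len(x))]

weights = [1, 2, 3, 2, 1]                # symmetric about the center
x = [0, 1, 0, 0, 1, 1, 0, 1, 0, 0]
seen = [list(x)]
for _ in range(10):
    x = wos_filter(x, weights)
    seen.append(list(x))

print(seen[1])                 # the smoothed signal [0,0,0,0,1,1,1,0,0,0]
print(seen[-1] == seen[-2])    # True: this example reaches a root
```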
The Stability of Asymmetric Hopfield Networks With Nonnegative Weights
 Theoretical Aspects of Neural Computation: A Multidisciplinary Perspective, Proceedings of the Hong Kong International Workshop (TANC'97)
, 1997
"... We have proved that an asymmetric Hopfield network with nonnegative weights is strictly stable, that is, the network almost evolves to a stable asynchronousmode shows that the asymmetric Hopfield network with nonnegative weights has almost the same stability in randomly asynchronous mode as a Hopfi ..."
Abstract

Cited by 1 (0 self)
We have proved that an asymmetric Hopfield network with nonnegative weights is strictly stable; that is, the network almost surely evolves to a stable state in randomly asynchronous mode. This shows that the asymmetric Hopfield network with nonnegative weights has almost the same stability in randomly asynchronous mode as a symmetric Hopfield network does, which is significant for associative memory on asymmetric Hopfield networks.
On the Computational Complexity of Binary and Analog Symmetric Hopfield Nets
"... We investigate the computational properties of finite binary and analogstate discretetime symmetric Hopfield nets. For binary networks, we obtain a simulation of convergent asymmetric networks by symmetric networks with only a linear increase in network size and computation time. Then we analyze t ..."
Abstract

Cited by 1 (1 self)
We investigate the computational properties of finite binary- and analog-state discrete-time symmetric Hopfield nets. For binary networks, we obtain a simulation of convergent asymmetric networks by symmetric networks with only a linear increase in network size and computation time. Then we analyze the convergence time of Hopfield nets in terms of the length of their bit representations. Here we construct an analog symmetric network whose convergence time exceeds the convergence time of any binary Hopfield net with the same representation length. Further, we prove that the MIN ENERGY problem for analog Hopfield nets is NP-hard, and provide a polynomial-time approximation algorithm for this problem in the case of binary nets. Finally, we show that symmetric analog nets with an external clock are computationally Turing universal.
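The MIN ENERGY problem concerns the standard quadratic energy of a symmetric Hopfield net. A small sketch (toy random weights, not the paper's construction) computes E(x) = -1/2 x^T W x and checks that asynchronous threshold updates never increase it, which is why finding a minimum-energy state is a natural optimization problem:

```python
import numpy as np

# Toy sketch of the energy landscape behind the MIN ENERGY problem
# (random small integer weights, not the paper's construction):
# E(x) = -1/2 x^T W x for symmetric zero-diagonal W and x in {-1,+1}^n.
# Asynchronous threshold updates never increase E, so binary Hopfield
# dynamics is a local search whose fixed points are local minima of E;
# MIN ENERGY asks for the global minimum.

rng = np.random.default_rng(1)
n = 6
A = rng.integers(-3, 4, size=(n, n)).astype(float)
W = np.triu(A, 1) + np.triu(A, 1).T    # symmetric, zero diagonal

def energy(x):
    return -0.5 * x @ W @ x

x = rng.choice([-1.0, 1.0], size=n)
energies = [energy(x)]
for t in range(50):
    i = t % n                          # cyclic asynchronous sweep
    s = W[i] @ x                       # net input to unit i (no self-weight)
    if s != 0:
        x[i] = 1.0 if s > 0 else -1.0  # threshold update
    energies.append(energy(x))

print("energies non-increasing:",
      all(b <= a for a, b in zip(energies, energies[1:])))
print("final energy:", energies[-1])
```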