Results 1–10 of 14
A Lagrangian Relaxation Network for Graph Matching
 IEEE Trans. Neural Networks
, 1996
Abstract

Cited by 26 (7 self)
A Lagrangian relaxation network for graph matching is presented. The problem is formulated as follows: given graphs G and g, find a permutation matrix M that brings the two sets of vertices into correspondence. Permutation matrix constraints are formulated in the framework of deterministic annealing. Our approach is in the same spirit as a Lagrangian decomposition approach in that the row and column constraints are satisfied separately with a Lagrange multiplier used to equate the two "solutions." Due to the unavoidable symmetries in graph isomorphism (resulting in multiple global minima), we add a symmetry-breaking self-amplification term in order to obtain a permutation matrix. With the application of a fixpoint-preserving algebraic transformation to both the distance measure and self-amplification terms, we obtain a Lagrangian relaxation network. The network performs minimization with respect to the Lagrange parameters and maximization with respect to the permutation matrix variable...
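A minimal sketch of the doubly stochastic (row and column) constraints under deterministic annealing, using alternating normalizations rather than the paper's Lagrange-multiplier dynamics; the benefit matrix and temperature below are illustrative assumptions, not values from the paper:

```python
import math

def soft_permutation(benefit, beta=20.0, iters=100):
    """Softly enforce permutation-matrix constraints: exponentiate a
    benefit matrix at inverse temperature beta, then alternately
    normalize rows and columns (Sinkhorn-style iterations). As beta
    grows, the result approaches a hard permutation matrix."""
    n = len(benefit)
    M = [[math.exp(beta * b) for b in row] for row in benefit]
    for _ in range(iters):
        for row in M:                              # rows sum to 1
            s = sum(row)
            for j in range(n):
                row[j] /= s
        for j in range(n):                         # columns sum to 1
            s = sum(M[i][j] for i in range(n))
            for i in range(n):
                M[i][j] /= s
    return M

# Toy vertex-compatibility scores for a 3-vertex matching problem.
B = [[0.9, 0.1, 0.0],
     [0.2, 0.1, 0.8],
     [0.1, 0.7, 0.2]]
M = soft_permutation(B)
# M is near the permutation matching vertex 0 -> 0, 1 -> 2, 2 -> 1.
```

The paper instead satisfies the row and column constraints in separate "solutions" equated through a Lagrange multiplier, and anneals beta; the constraint structure being approximated is the same.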
Algebraic Transformations of Objective Functions
 Neural Networks
, 1994
Abstract

Cited by 26 (11 self)
Many neural networks can be derived as optimization dynamics for suitable objective functions. We show that such networks can be designed by repeated transformations of one objective into another with the same fixpoints. We exhibit a collection of algebraic transformations which reduce network cost and increase the set of objective functions that are neurally implementable. The transformations include simplification of products of expressions, functions of one or two expressions, and sparse matrix products (all of which may be interpreted as Legendre transformations); also the minimum and maximum of a set of expressions. These transformations introduce new interneurons which force the network to seek a saddle point rather than a minimum. Other transformations allow control of the network dynamics, by reconciling the Lagrangian formalism with the need for fixpoints. We apply the transformations to simplify a number of structured neural networks, beginning with the standard reduction of...
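As a concrete instance of the fixpoint-preserving idea (an invented toy example, not one from the paper): a quadratic term x² in an objective can be replaced by the Legendre-style expression max over sigma of (2·sigma·x − sigma²), which introduces an interneuron sigma and turns the minimization into a saddle-point problem with the same fixpoints, since the maximum is attained at sigma = x. A quick numerical check:

```python
# Numerical check that x**2 equals max over sigma of (2*sigma*x - sigma**2),
# the kind of fixpoint-preserving algebraic transformation the paper studies.
# The new variable sigma plays the role of an interneuron; the maximum is
# attained at sigma = x, so the transformation leaves fixpoints unchanged.

def transformed(x, sigma):
    return 2.0 * sigma * x - sigma ** 2

x = 1.7
# Scan sigma on a fine grid; the best value recovers x**2.
best = max(transformed(x, s / 1000.0) for s in range(-5000, 5001))
```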
Untimed and Misrepresented: Connectionism and the Computer Metaphor
, 1992
Abstract

Cited by 23 (4 self)
The computer metaphor for the mind or brain has long outlived its usefulness, being based on Cartesian ideas. Connectionism has not broken free from this metaphor, and this has stunted the directions connectionist research has taken. The subordinate role of timing in computations has meant that networks with real-valued time-lags on signals passing between nodes have been ignored. The notion of representation in connectionism is generally confused; this can be clarified when at all times it is made explicit who or what Q and S are in the formula "P is used by Q to represent R to S". Frequently they may be layers or modules within a network, but the typical confusion is symptomatic of the computer metaphor, which in practice favours feedforward and militates against arbitrarily connected networks. Rejecting this metaphor, an alternative paradigm is suggested of a brain as a complex dynamical system; investigating the dynamics of arbitrarily connected networks with real-valued time-lags, speci...
Bayesian inference on visual grammars by neural nets that optimize
 YALE COMPUTER SCIENCE DEPARTMENT
, 1991
Abstract

Cited by 15 (3 self)
We exhibit a systematic way to derive neural nets for vision problems. It involves formulating a vision problem as Bayesian inference or decision on a comprehensive model of the visual domain given by a probabilistic grammar. A key feature of this grammar is the way in which it eliminates model information, such as object labels, as it produces an image; correspondence problems and other noise removal tasks result. The neural nets that arise most directly are generalized assignment networks. Also there are transformations which naturally yield improved algorithms such as correlation matching in scale space and the Frameville neural nets for high-level vision. Networks derived this way generally have objective functions with spurious local minima; such minima may commonly be avoided by dynamics that include deterministic annealing, for example recent improvements to Mean Field Theory dynamics. The grammatical method of neural net design allows domain knowledge to enter from all levels of the grammar, including "abstract" levels remote from the final image data, and
Neural Mechanisms For Self-Organization Of Emergent Schemata, Dynamical Schema Processing, And Semantic Constraint Satisfaction
, 1993
Abstract

Cited by 7 (4 self)
The concept of schema and some general characteristics of models using schemata are discussed. It is shown by computer simulations how a combination of a number of simple neural circuits is capable of performing actions similar to those commonly attributed to schemata, especially self-organization of a representational code, recognition of spatial and temporal structure, adaptive performance, and semantic constraint satisfaction.

1. INTRODUCTION

What are the basic units of cognitive processing? The so-called Classical Theories (Fodor & Pylyshyn, 1988) argue that these units are symbols together with symbolic processes. On the other hand, the Connectionist School argues that we should approach cognition at another level and study how neuron-like elements interact to produce collective effects (Rumelhart et al., 1986). Both camps seem to assume that the other is totally wrong. Yet, there is no doubt that human behaviour exhibits examples of both the symbol processing capabilities of t...
Composition
Abstract

Cited by 5 (1 self)
this paper will be the difficulty in distinguishing the inside of an entity from its outside.
Symbolic neural networks derived from stochastic grammar domain models
 Connectionist Symbolic Integration. Lawrence Erlbaum Associates
, 1995
Abstract

Cited by 4 (4 self)
Starting with a statistical domain model in the form of a stochastic grammar, one can derive neural network architectures with some of the expressive power of a semantic network and also some of the pattern recognition and learning capabilities of more conventional neural networks. For example in this paper a version of the “Frameville” architecture, and in particular its objective function and constraints, is derived from a stochastic grammar schema. Possible optimization dynamics for this architecture, and relationships to other recent architectures such as Bayesian networks and variable-binding networks, are also discussed. This paper outlines a statistical approach to unifying certain symbolic and neural net architectures, by deriving them from a stochastic domain model with sufficient structure. The domain model is a stochastic L-system grammar, whose rules for generating objects and their parts each include a Boltzmann probability distribution. Using such a domain model in high-level vision, it is possible to formulate object recognition and visual learning problems as constrained optimization problems [16]
Markov Random Fields and Neural Networks with Applications to Early Vision Problems
, 1991
Abstract

Cited by 3 (1 self)
The current resurgence of interest in Neural Networks has opened up several basic issues. In this chapter, we explore the connections between this area and Markov Random Fields. We are specifically concerned with early vision problems which have already benefited from a parallel and distributed computing perspective. We explore the relationships between the two fields at two different levels of a computational approach. Applications highlighting specific instances where ideas from the two approaches intertwine are discussed.
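A toy example of the MRF formulation common in the early-vision problems the chapter discusses (this sketch and its parameters are illustrative, not taken from the chapter): binary image denoising as energy minimization over an Ising-style field, relaxed here with iterated conditional modes (ICM), a simple deterministic scheme:

```python
# Toy MAP estimation on a binary Markov random field. Labels are in
# {-1, +1}; the energy combines a data term (agree with the observed
# pixel) and a 4-neighbour smoothness term (agree with neighbours).
# Minimized by iterated conditional modes: each pixel takes the sign of
# its local field, swept repeatedly.

def icm_denoise(noisy, data_weight=1.0, smooth_weight=1.5, sweeps=5):
    h, w = len(noisy), len(noisy[0])
    x = [row[:] for row in noisy]          # initialize labels at the data
    for _ in range(sweeps):
        for i in range(h):
            for j in range(w):
                field = data_weight * noisy[i][j]
                for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ni, nj = i + di, j + dj
                    if 0 <= ni < h and 0 <= nj < w:
                        field += smooth_weight * x[ni][nj]
                x[i][j] = 1 if field >= 0 else -1
    return x

# A 4x4 all-ones image with one flipped pixel is restored by smoothness:
noisy = [[1, 1, 1, 1],
         [1, -1, 1, 1],
         [1, 1, 1, 1],
         [1, 1, 1, 1]]
clean = icm_denoise(noisy)
```

ICM only finds a local minimum; the annealing and mean-field dynamics discussed elsewhere on this page are the standard ways to do better on such energies.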
A Macroscopic Model of Oscillation in Ensembles of Inhibitory and Excitatory Neurons
, 1993
Abstract

Cited by 3 (1 self)
Very large networks of neurons can be characterized in a tractable and meaningful way by considering the average or ensemble behavior of groups of cells. This paper develops a mathematical model to characterize a homogeneous neural group at a macroscopic level, given a microscopic description of individual cells. This model is then used to study the interaction between two neuron groups. Conditions that lead to oscillatory behavior in both excitatory and inhibitory groups of cells are determined. Using Fourier series analysis, we obtain approximate expressions for the frequency of oscillations of the average input and output activities, and quantitatively relate them to other network parameters. Computer simulation results show these frequency estimations to be quite accurate.

Keywords: Large neural networks, Ensemble behavior, Oscillations, Inhibitory and Excitatory Cell Assemblies, Frequency estimation.

1 Introduction

Biological neural networks consist of very large numbers of neu...
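As an illustration of how excitatory-inhibitory coupling produces oscillation at a parameter-determined frequency (a minimal linear sketch of my own, not the paper's macroscopic model):

```python
# Minimal linear sketch of excitatory-inhibitory oscillation. The
# inhibitory activity I is driven by the excitatory activity E and feeds
# back negatively; with leak rate `leak` and coupling `w`, the exact
# solution is E(t) = exp(-leak*t) * cos(w*t), so the angular frequency
# of the ensemble average equals the coupling strength w.

def simulate(w=1.0, leak=0.1, dt=0.01, steps=3000):
    E, I = 1.0, 0.0
    trace = []
    for _ in range(steps):
        dE = -leak * E - w * I   # inhibition suppresses excitation
        dI = -leak * I + w * E   # excitation drives inhibition
        E += dt * dE
        I += dt * dI
        trace.append(E)
    return trace

trace = simulate()
# Count zero crossings of E over t in [0, 30]: several, i.e. the
# average activity oscillates rather than settling monotonically.
sign_changes = sum(1 for a, b in zip(trace, trace[1:]) if a * b < 0)
```

The paper works with nonlinear rate equations for whole ensembles and extracts the frequency via Fourier analysis; this linearized pair only shows the mechanism relating frequency to a network parameter.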