Results 1 - 4 of 4
Learning in Compositional Hierarchies: Inducing the Structure of Objects from Data
In Advances in Neural Information Processing Systems 6, 1994
Abstract

Cited by 6 (0 self)
I propose a learning algorithm for learning hierarchical models for object recognition. The model architecture is a compositional hierarchy that represents part-whole relationships: parts are described in the local context of substructures of the object. The focus of this report is learning hierarchical models from data, i.e. inducing the structure of model prototypes from observed exemplars of an object. At each node in the hierarchy, a probability distribution governing its parameters must be learned. The connections between nodes reflect the structure of the object. The formulation of substructures is encouraged such that their parts become conditionally independent. The resulting model can be interpreted as a Bayesian Belief Network and also is in many respects similar to the stochastic visual grammar described by Mjolsness.
1 INTRODUCTION
Model-based object recognition solves the problem of invariant recognition by relying on stored prototypes at unit scale positioned at the ori...
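The conditional-independence idea in this abstract can be illustrated with a toy two-level part-whole hierarchy, viewed as a Bayesian Belief Network: given the substructure's parameter, the parts factorize. This is a minimal sketch with invented numbers and a one-dimensional "position" parameter, not the paper's model.

```python
import math

def gaussian(x, mu, var):
    """Density of a 1-D Gaussian at x."""
    return math.exp(-0.5 * (x - mu) ** 2 / var) / math.sqrt(2 * math.pi * var)

def joint_density(sub_pos, part_positions, offsets, var_sub, var_part):
    """p(sub, parts) = p(sub) * prod_k p(part_k | sub).

    Each part's position is modeled relative to its substructure (its
    "local context"), so conditioned on sub_pos the parts are independent
    and the joint density factorizes over them.
    """
    p = gaussian(sub_pos, 0.0, var_sub)  # prior on the substructure's position
    for pos, off in zip(part_positions, offsets):
        p *= gaussian(pos, sub_pos + off, var_part)  # part given substructure
    return p

# Parts sitting at their expected offsets score higher than displaced parts.
good = joint_density(0.0, [-1.0, 1.0], [-1.0, 1.0], var_sub=4.0, var_part=0.25)
bad = joint_density(0.0, [-1.0, 2.0], [-1.0, 1.0], var_sub=4.0, var_part=0.25)
```

Here `good > bad` because displacing one part from its learned offset is penalized only through that part's own conditional, which is exactly the factorization the abstract describes.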
Mixture Models and the EM Algorithm for Object Recognition within Compositional Hierarchies Part 1: Recognition
1993
Abstract

Cited by 3 (1 self)
We apply the Expectation Maximization (EM) algorithm to an assignment problem where, in addition to binary assignment variables, analog parameters must be estimated. As an example, we use the problem of part labelling in the context of model-based object recognition where models are stored in form of a compositional hierarchy. This problem has been formulated previously as a graph matching problem and stated in terms of minimizing an objective function that a recurrent neural network solves [11, 12, 5, 8, 22]. Mjolsness [9, 10] has introduced a stochastic visual grammar as a model for this problem; there the matching problem arises from an index renumbering operation via a permutation matrix. The optimization problem w.r.t. the match variables is difficult and Mean Field Annealing techniques are used to solve it. Here we propose to model the part labelling problem in terms of a mixture of distributions, each describing the parameters of a part. Under this model, the match variables corres...
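The mixture-of-distributions view of matching can be sketched with a generic 1-D Gaussian-mixture EM, in which the E-step produces soft assignment ("match") variables and the M-step re-estimates the analog parameters. This is an illustrative stand-in, not the paper's part-labelling model; components, data, and parameter values are invented.

```python
import numpy as np

def em_mixture(x, k=2, iters=50):
    """Fit a 1-D Gaussian mixture with EM.

    E-step: responsibilities r[i, j] = P(component j | observation i),
    the soft analogue of binary assignment variables.
    M-step: re-estimate means, variances, and mixing weights from
    the soft counts.
    """
    n = len(x)
    mu = np.array([x.min(), x.max()], dtype=float)[:k]  # spread-out init
    var = np.full(k, np.var(x))
    pi = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: soft assignments under the current parameters
        dens = (np.exp(-0.5 * (x[:, None] - mu) ** 2 / var)
                / np.sqrt(2 * np.pi * var))
        r = pi * dens
        r /= r.sum(axis=1, keepdims=True)
        # M-step: weighted re-estimation of the analog parameters
        nk = r.sum(axis=0)
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
        pi = nk / n
    return mu, var, pi

rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0.0, 1.0, 200), rng.normal(5.0, 1.0, 200)])
mu, var, pi = em_mixture(x)
```

On this well-separated toy data the two means converge near 0 and 5, and the mixing weights sum to one by construction.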
Compositionality in Neural Systems
1995
Abstract
...angements of symbols that are possible a priori from a mere combinatorial point of view are illegitimate as linguistic constructions. The number of character strings of length 1,000 that make up a proper English text is vanishingly small when compared to the number of all possible strings of such length. Thus, while infinitely productive, language is at the same time severely constrained. When observed from the "surface," the composition mechanism in language appears simple. Individual characters are assembled into syllables, which are themselves assembled into words, further composed into phrases, sentences, etc. One text differs from another text in the same language only by the relative positioning (relations) among the constituents (symbols), and not for instance by the frequencies of occurrence of each symbol; these frequencies are about the same for any sufficiently long text. Yet, encoded within this apparently simple ...
CFART: A New Multi-Resolutional Adaptive Resonance System for Object Recognition
Abstract
In this report, we propose a cascade fuzzy ART (CFART) neural network which can function as an extensible database in a model-based object recognition system. The proposed CFART network contains multiple layers. It preserves the prominent characteristics of a fuzzy ART network and extends fuzzy ART's capability toward hierarchical representation of input patterns. The learning process of the proposed network is unsupervised and self-organizing, and includes a top-down searching process and a bottom-up learning process. The top-down and bottom-up learning processes interact with each other in a closely coupled manner. Basically, the top-down searching guides the bottom-up learning whereas the bottom-up learning influences the top-down searching by changing its fuzzy search boundary. In addition, a global searching tree is built to speed up the learning and recognition processes. The proposed CFART can accept both binary and analog inputs. With fast learning and categorization capabil...
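The fuzzy ART building block that CFART cascades can be sketched in a few lines. This is a minimal single-layer fuzzy ART learner following the standard equations (complement coding, choice function, vigilance test, fast learning), not the multi-layer CFART cascade itself; parameter values are illustrative.

```python
import numpy as np

def fuzzy_art_learn(inputs, rho=0.75, alpha=0.001, beta=1.0):
    """Single-layer fuzzy ART category learning (a sketch, not CFART).

    Choice function:  T_j = |I ^ w_j| / (alpha + |w_j|)
    Vigilance test:   |I ^ w_j| / |I| >= rho
    Fast learning:    w_j <- beta*(I ^ w_j) + (1 - beta)*w_j
    where ^ is the component-wise minimum (fuzzy AND).
    """
    weights = []  # one template vector per committed category
    labels = []
    for a in inputs:
        i_vec = np.concatenate([a, 1.0 - a])  # complement coding
        # rank committed categories by the choice function (search phase)
        order = sorted(range(len(weights)),
                       key=lambda j: -np.minimum(i_vec, weights[j]).sum()
                                     / (alpha + weights[j].sum()))
        for j in order:
            match = np.minimum(i_vec, weights[j]).sum() / i_vec.sum()
            if match >= rho:  # vigilance passed: resonance, update template
                weights[j] = (beta * np.minimum(i_vec, weights[j])
                              + (1 - beta) * weights[j])
                labels.append(j)
                break
        else:  # no category resonates: commit a new one
            weights.append(i_vec.copy())
            labels.append(len(weights) - 1)
    return weights, labels

# Two loose clusters of analog inputs should settle into two categories.
data = np.array([[0.10, 0.10], [0.12, 0.10], [0.90, 0.85], [0.88, 0.90]])
w, labs = fuzzy_art_learn(data)
```

Raising the vigilance `rho` toward 1 shrinks each category's fuzzy boundary and commits more categories; lowering it merges inputs into fewer, coarser categories, which is the knob the abstract's search/learning interaction adjusts across layers.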