Results 1–10 of 54
A general framework for unsupervised processing of structured data
Neurocomputing, 2004
Learning shape-classes using a mixture of tree-unions
IEEE Trans. PAMI, 2006
Cited by 27 (7 self)
Abstract—This paper poses the problem of tree-clustering as that of fitting a mixture of tree-unions to a set of sample trees. The tree-unions are structures from which the individual data samples belonging to a cluster can be obtained by edit operations. The distribution of observed tree nodes in each cluster sample is assumed to be governed by a Bernoulli distribution. The clustering method is designed to operate when the correspondences between nodes are unknown and must be inferred as part of the learning process. We adopt a minimum description length approach to the problem of fitting the mixture model to data. We make maximum-likelihood estimates of the Bernoulli parameters. The tree-unions and the mixing proportions are sought so as to minimize the description-length criterion. This is the sum of the negative logarithm of the Bernoulli distribution and a message-length criterion that encodes both the complexity of the union-trees and the number of mixture components. We locate node correspondences by minimizing the edit distance with the current tree-unions, and show that the edit distance is linked to the description-length criterion. The method can be applied to both unweighted and weighted trees. We illustrate the utility of the resulting algorithm on the problem of classifying 2D shapes using a shock-graph representation. Index Terms—Structural learning, tree clustering, mixture modeling, minimum description length, model codes, shock graphs.
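The description-length criterion summarized in the abstract can be sketched numerically. The toy function below (its name, the flat `bits_per_node` complexity penalty, and the per-node Bernoulli encoding are simplifying assumptions, not the authors' exact criterion) scores one tree-union cluster as negative log-likelihood plus a model-complexity term:

```python
import math

def description_length(node_counts, n_samples, bits_per_node=1.0):
    """Toy description-length score for one tree-union cluster.

    node_counts[j] = number of cluster samples in which union node j occurs;
    n_samples      = number of sample trees assigned to the cluster.
    The first term is the negative log-likelihood of the node occurrences
    under per-node Bernoulli parameters at their maximum-likelihood
    estimates theta_j = k_j / n; the second is a simplified message-length
    penalty that grows with the size of the union tree.
    """
    nll = 0.0
    for k in node_counts:
        theta = k / n_samples
        # Nodes present in all or none of the samples cost nothing to encode.
        if 0.0 < theta < 1.0:
            nll -= k * math.log(theta) + (n_samples - k) * math.log(1.0 - theta)
    penalty = bits_per_node * len(node_counts)  # model-complexity term
    return nll + penalty
```

Merging two unions, or adding a mixture component, would then be accepted only when it lowers the summed score.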
Neural Gas for Sequences
Proceedings of the Workshop on Self-Organizing Networks (WSOM), pages 53–58, Kyushu Institute of Technology, 2003
Cited by 20 (3 self)
For unsupervised sequence processing, standard self-organizing maps can be naturally extended by using recurrent connections and explicit context representations. Models thereof are the temporal Kohonen map (TKM), recursive SOM, SOM for structured data (SOMSD), and HSOM for sequences (HSOM-S). Here, we discuss and compare the capabilities of exemplary approaches to store different types of sequences. We propose a new efficient model, the merge SOM (MSOM), which combines ideas of TKM and SOMSD and which is particularly suited for processing sequences with dynamic multimodal densities.
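The core of the MSOM can be sketched in a few lines: each neuron stores a weight and a context vector, the distance blends input and context mismatch, and the next context is a merge of the previous winner's weight and context. A minimal sketch (the mixing parameters `alpha` and `gamma` and the plain-Python vectors are illustrative assumptions):

```python
def msom_distance(x, c, w_i, c_i, alpha=0.5):
    """Distance of input x with current context c to neuron i,
    blending input mismatch (weight w_i) and context mismatch (context c_i)."""
    dx = sum((a - b) ** 2 for a, b in zip(x, w_i))
    dc = sum((a - b) ** 2 for a, b in zip(c, c_i))
    return alpha * dx + (1.0 - alpha) * dc

def merge_context(w_winner, c_winner, gamma=0.5):
    """Next time step's context: a merge of the previous winner's
    weight vector and its stored context vector."""
    return [gamma * w + (1.0 - gamma) * c for w, c in zip(w_winner, c_winner)]
```

Training then proceeds as in a standard SOM, moving both the winner's weight toward the input and its context toward the merged context.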
Neural Methods for Non-Standard Data
Proceedings of the 12th European Symposium on Artificial Neural Networks (ESANN 2004), d-side pub, 2004
Cited by 12 (3 self)
Standard pattern recognition provides effective and noise-tolerant tools for machine learning tasks; however, most approaches only deal with real vectors of a finite and fixed dimensionality. In this tutorial paper, we give an overview of extensions of pattern recognition towards non-standard data which are not contained in a finite-dimensional space, such as strings, sequences, trees, graphs, or functions. Two major directions can be distinguished in the neural networks literature: models can be based on a similarity measure adapted to non-standard data, including kernel methods for structures as a very prominent approach, but also alternative metric-based algorithms and functional networks; alternatively, non-standard data can be processed recursively within supervised and unsupervised recurrent and recursive networks and fully recurrent systems.
The graph neural network model
IEEE Transactions on Neural Networks, 2009
Cited by 12 (4 self)
Many underlying relationships among data in several areas of science and engineering, e.g., computer vision, molecular chemistry, molecular biology, pattern recognition, and data mining, can be represented in terms of graphs. In this paper, we propose a new neural network model, called the graph neural network (GNN) model, that extends existing neural network methods for processing data represented in graph domains. This GNN model, which can directly process most of the practically useful types of graphs, e.g., acyclic, cyclic, directed, and undirected, implements a function τ(G, n) ∈ IR^m that maps a graph G and one of its nodes n into an m-dimensional Euclidean space. A supervised learning algorithm is derived to estimate the parameters of the proposed GNN model. The computational cost of the proposed algorithm is also considered. Some experimental results are shown to validate the proposed learning algorithm and to demonstrate its generalization capabilities.
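The heart of the GNN model is a fixed-point computation: each node's state is updated from its neighbors' states until convergence, and τ(G, n) is read out from the converged state of node n. A minimal sketch (scalar states, a hand-picked contraction weight `w`, and mean aggregation are simplifying assumptions; the actual model uses learned parametric transition and output functions):

```python
def gnn_states(adj, labels, w=0.3, iters=100):
    """Iterate h_v = label_v + w * mean(neighbor states) toward a fixed point.

    adj maps each node to its neighbor list; labels maps each node to a
    scalar label. With |w| < 1 each update is a contraction, so the states
    converge to a unique fixed point, mirroring the fixed-point computation
    underlying the GNN model.
    """
    h = {v: 0.0 for v in adj}
    for _ in range(iters):
        h = {v: labels[v] + (w * sum(h[u] for u in adj[v]) / len(adj[v])
                             if adj[v] else 0.0)
             for v in adj}
    return h
```

A readout function applied to `h[n]` would then play the role of the output part of τ(G, n).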
Dynamics and topographic organization in recursive self-organizing map
Neural Computation, 2006
Cited by 9 (2 self)
Recently, there has been an outburst of interest in extending topographic maps of vectorial data to more general data structures, such as sequences or trees. However, at present there is no general consensus as to how best to process sequences using topographic maps, and this topic remains a very active focus of current neurocomputational research. The representational capabilities and internal representations of the models are not well understood. We rigorously analyze a generalization of the Self-Organizing Map (SOM) for processing sequential data, the Recursive SOM (RecSOM) (Voegtlin, 2002), as a non-autonomous dynamical system consisting of a set of fixed-input maps. We argue that contractive fixed-input maps are likely to produce Markovian organizations of receptive fields on the RecSOM map. We derive bounds on the parameter β (weighting the importance of importing past information when processing sequences) under which contractiveness of the fixed-input maps is guaranteed. Some generalizations of SOM contain a dynamic module responsible for processing temporal contexts as an integral part of the model. We show that Markovian topographic maps of sequential data can be produced using a simple fixed (non-adaptable) dynamic module externally feeding a standard topographic model designed to process static vectorial data of fixed dimensionality (e.g. SOM). However, by allowing trainable feedback connections one can obtain Markovian maps with superior memory depth and topography preservation. We elaborate upon the importance of non-Markovian organizations in topographic maps of sequential data.
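The RecSOM update that β enters can be sketched directly: each neuron stores a weight vector compared against the current input and a context vector compared against the previous activation profile of the whole map; β weights the context term. A minimal one-step sketch (function name and plain-Python vectors are illustrative assumptions):

```python
import math

def recsom_step(s, y_prev, weights, contexts, alpha=1.0, beta=0.5):
    """One RecSOM step.

    For each neuron i: d_i = alpha*||s - w_i||^2 + beta*||y_prev - c_i||^2,
    where s is the current input, y_prev the previous map activation,
    w_i the neuron's weight and c_i its context vector.
    The new activation is y_i = exp(-d_i); larger beta means the past
    (via y_prev) matters more, which is where the contractiveness bounds
    on beta come in.
    """
    d = []
    for w_i, c_i in zip(weights, contexts):
        dw = sum((a - b) ** 2 for a, b in zip(s, w_i))
        dc = sum((a - b) ** 2 for a, b in zip(y_prev, c_i))
        d.append(alpha * dw + beta * dc)
    return [math.exp(-di) for di in d]
```

Feeding each step's output back in as `y_prev` yields the non-autonomous dynamical system the paper analyzes.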
Topographic Organization of Receptive Fields in Recursive Self-Organizing Map
In: Advances in Natural Computation (pp. 676–685), Lecture Notes in Computer Science, 2005
Cited by 6 (2 self)
Abstract. Recently, there has been an outburst of interest in extending topographic maps of vectorial data to more general data structures, such as sequences or trees. The representational capabilities and internal representations of the models are not well understood. We concentrate on a generalization of the Self-Organizing Map (SOM) for processing sequential data – the Recursive SOM (RecSOM [1]). We argue that contractive fixed-input dynamics of RecSOM are likely to lead to Markovian organizations of receptive fields on the map. We show that Markovian topographic maps of sequential data can be produced using a simple fixed (non-adaptable) dynamic module externally feeding a standard topographic model designed to process static vectorial data of fixed dimensionality (e.g. SOM). We elaborate upon the importance of non-Markovian organizations in topographic maps of sequential data.
Mathematical Aspects of Neural Networks
European Symposium on Artificial Neural Networks, 2003
Cited by 6 (4 self)
In this tutorial paper about mathematical aspects of neural networks, we focus on two directions: on the one hand, we motivate standard mathematical questions and the well-studied theory of classical neural models used in machine learning; on the other hand, we collect some recent theoretical results (as of the beginning of 2003) in the respective areas. Thereby, we follow the dichotomy offered by the overall network structure and restrict ourselves to feedforward networks, recurrent networks, and self-organizing neural systems, respectively.
Clustering XML documents using self-organizing maps for structures
In: Workshop of the INitiative for the Evaluation of XML Retrieval, 2005
Cited by 6 (3 self)
Abstract. Self-Organizing Maps capable of encoding structured information will be used for the clustering of XML documents. Documents formatted in XML are appropriately represented as graph data structures. It will be shown that the Self-Organizing Maps can be trained in an unsupervised fashion to group XML structured data into clusters, and that this task scales linearly with the increasing size of the corpus. It will also be shown that some simple prior knowledge of the data structures is beneficial to the efficient grouping of the XML documents.
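The structure encoding behind this kind of SOM (SOM-SD) represents a tree node as its label concatenated with the map coordinates of its children's winning neurons, padded to a fixed out-degree so every input vector has the same dimensionality. A minimal sketch of that input construction (function name, padding value, and the fixed `max_children` are illustrative assumptions):

```python
def somsd_input(label, child_winners, max_children, pad=(-1, -1)):
    """Build the SOM-SD input vector for one tree node.

    label          : the node's numeric label (a list of floats);
    child_winners  : map coordinates (row, col) of each child's winner;
    max_children   : fixed maximum out-degree of the trees.
    Missing children are padded so every vector has the same length,
    which is what lets a standard SOM process the structured data.
    """
    vec = list(label)
    for i in range(max_children):
        coord = child_winners[i] if i < len(child_winners) else pad
        vec.extend(coord)
    return vec
```

Processing a tree bottom-up, each node's winner feeds into its parent's input vector, which is why one unsupervised pass per tree suffices and training scales linearly with corpus size.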