Dimensions of neural-symbolic integration – a structural survey
 We Will Show Them: Essays in Honour of Dov Gabbay
Abstract

Cited by 25 (8 self)
Research on integrated neural-symbolic systems has made significant progress in the recent past. In particular, the understanding of ways to deal with symbolic knowledge within connectionist systems (also called artificial neural networks) has reached a critical mass which enables the community to …
Neural Gas for Sequences
 Proceedings of the Workshop on Self-Organizing Networks (WSOM), pages 53–58, Kyushu Institute of Technology
, 2003
Abstract

Cited by 20 (3 self)
For unsupervised sequence processing, standard self-organizing maps can be naturally extended by using recurrent connections and explicit context representations. Models of this kind include the temporal Kohonen map (TKM), the recursive SOM, the SOM for structured data (SOMSD), and the HSOM for sequences (HSOMS). Here, we discuss and compare the capabilities of exemplary approaches to store different types of sequences. We propose a new efficient model, the merge SOM (MSOM), which combines ideas of the TKM and SOMSD and which is particularly suited for processing sequences with dynamic multimodal densities.
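To make the merge-context idea concrete, here is a minimal sketch of an MSOM-style winner computation. The parameter values (alpha, gamma), the toy data, and the function name are illustrative choices of ours, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

n_neurons, dim = 10, 2
alpha, gamma = 0.5, 0.5                # pattern-vs-context weighting; illustrative values
W = rng.normal(size=(n_neurons, dim))  # weight (pattern) vectors
C = np.zeros((n_neurons, dim))         # context vectors, one per neuron

def msom_winner(x, context):
    """Merge-SOM style distance: blend of pattern match and context match."""
    d = (alpha * np.sum((W - x) ** 2, axis=1)
         + (1 - alpha) * np.sum((C - context) ** 2, axis=1))
    return int(np.argmin(d))

# Process a toy sequence: the next context is a merge of the previous
# winner's weight and context vectors (hence "merge" SOM).
context = np.zeros(dim)
winners = []
for x in rng.normal(size=(5, dim)):
    i = msom_winner(x, context)
    winners.append(i)
    context = gamma * W[i] + (1 - gamma) * C[i]

print(winners)
```

In a full model the weight and context vectors would also be adapted towards the input and the current context after each winner selection; the sketch only shows the recursive winner dynamics.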
Unsupervised Recursive Sequence Processing
 Neurocomputing
, 2003
Abstract

Cited by 17 (6 self)
We propose a self-organizing map (SOM) for sequences that extends the standard SOM with two features: the recursive update of Sperduti [7] and the hyperbolic neighborhood of Ritter [5]. While the former integrates the currently presented item and recent map activations, the latter accommodates the possibly exponentially growing diversification of sequences over time. Discrete and real-valued sequences can be processed efficiently with this method, as demonstrated in three experiments.
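The recursive update can be sketched as follows: each neuron compares the input against its pattern weight and, in addition, compares its learned context reference against the lattice position of the previous winner. For brevity a regular 1-D lattice stands in for the hyperbolic grid; all parameter values and names are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

n, dim = 8, 2
alpha, beta = 0.7, 0.3                    # illustrative weighting of the two terms
loc = np.arange(n, dtype=float)[:, None]  # neuron positions; a 1-D lattice stands in
                                          # here for the hyperbolic grid of Ritter
W = rng.normal(size=(n, dim))             # pattern weights
R = rng.uniform(0, n, size=(n, 1))        # learned references to the previous winner's
                                          # lattice position (Sperduti-style recursion)

def winner(x, prev_loc):
    """Recursive distance: match the input and the previous winner's location."""
    d = (alpha * np.sum((W - x) ** 2, axis=1)
         + beta * np.sum((R - prev_loc) ** 2, axis=1))
    return int(np.argmin(d))

prev = np.zeros(1)
path = []
for x in rng.normal(size=(4, dim)):
    i = winner(x, prev)
    path.append(i)
    prev = loc[i]

print(path)
```

The hyperbolic neighborhood matters because the number of neurons within a given lattice radius grows exponentially on a hyperbolic grid, matching the exponential growth of distinct sequence contexts.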
Neural Methods for Non-Standard Data
 Proceedings of the 12th European Symposium on Artificial Neural Networks (ESANN 2004), d-side pub
, 2004
Abstract

Cited by 12 (3 self)
Standard pattern recognition provides effective and noise-tolerant tools for machine learning tasks; however, most approaches only deal with real vectors of a finite and fixed dimensionality. In this tutorial paper, we give an overview of extensions of pattern recognition towards non-standard data which are not contained in a finite-dimensional space, such as strings, sequences, trees, graphs, or functions. Two major directions can be distinguished in the neural networks literature: on the one hand, models can be based on a similarity measure adapted to non-standard data, including kernel methods for structures as a very prominent approach, but also alternative metric-based algorithms and functional networks; on the other hand, non-standard data can be processed recursively within supervised and unsupervised recurrent and recursive networks and fully recurrent systems.
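As a concrete instance of a similarity measure adapted to non-standard data, a minimal spectrum-style string kernel counts shared k-length substrings; the function name and the choice k = 2 are illustrative, not from the paper:

```python
from collections import Counter

def spectrum_kernel(s: str, t: str, k: int = 2) -> int:
    """Count shared k-length substrings between two strings (a simple string kernel)."""
    cs = Counter(s[i:i + k] for i in range(len(s) - k + 1))
    ct = Counter(t[i:i + k] for i in range(len(t) - k + 1))
    return sum(cs[g] * ct[g] for g in cs)

# "abab" contains {ab: 2, ba: 1}; "baba" contains {ba: 2, ab: 1}
print(spectrum_kernel("abab", "baba"))  # -> 2*1 + 1*2 = 4
```

Because this is an inner product between substring-count vectors, it is a valid positive semi-definite kernel and can be plugged into kernel machines directly.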
Dynamics and topographic organization in recursive self-organizing map
 Neural Computation
, 2006
Abstract

Cited by 9 (2 self)
Recently, there has been an outburst of interest in extending topographic maps of vectorial data to more general data structures, such as sequences or trees. However, at present, there is no general consensus as to how best to process sequences using topographic maps, and this topic remains a very active focus of current neurocomputational research. The representational capabilities and internal representations of the models are not well understood. We rigorously analyze a generalization of the Self-Organizing Map (SOM) for processing sequential data, the Recursive SOM (RecSOM) (Voegtlin, 2002), as a non-autonomous dynamical system consisting of a set of fixed-input maps. We argue that contractive fixed-input maps are likely to produce Markovian organizations of receptive fields on the RecSOM map. We derive bounds on the parameter β (weighting the importance of importing past information when processing sequences) under which contractiveness of the fixed-input maps is guaranteed. Some generalizations of SOM contain a dynamic module responsible for processing temporal contexts as an integral part of the model. We show that Markovian topographic maps of sequential data can be produced using a simple fixed (non-adaptable) dynamic module externally feeding a standard topographic model designed to process static vectorial data of fixed dimensionality (e.g. SOM). However, by allowing trainable feedback connections one can obtain Markovian maps with superior memory depth and topography preservation. We elaborate upon the importance of non-Markovian organizations in topographic maps of sequential data.
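The fixed-input map of RecSOM sends the previous activation profile to a new one, and its contraction ratio can be probed numerically. The sketch below uses the standard RecSOM activation form with illustrative parameter values; the empirical ratio estimate is ours and does not reproduce the analytical β bounds of the paper:

```python
import numpy as np

rng = np.random.default_rng(2)

n, dim = 6, 2
alpha, beta = 1.0, 0.1              # small beta favours contractive dynamics (illustrative)
W = rng.normal(size=(n, dim))       # input weights
C = rng.uniform(0, 1, size=(n, n))  # context weights: one context vector per neuron,
                                    # living in the space of activation profiles

def recsom_map(x, y_prev):
    """Fixed-input map of RecSOM: previous activation profile -> new profile."""
    return np.exp(-alpha * np.sum((W - x) ** 2, axis=1)
                  - beta * np.sum((C - y_prev) ** 2, axis=1))

# Crude empirical estimate of the Lipschitz constant of the fixed-input map
# for one input x, by sampling pairs of previous activations.
x = rng.normal(size=dim)
ratios = []
for _ in range(200):
    y1 = rng.uniform(0, 1, size=n)
    y2 = rng.uniform(0, 1, size=n)
    num = np.linalg.norm(recsom_map(x, y1) - recsom_map(x, y2))
    den = np.linalg.norm(y1 - y2)
    ratios.append(num / den)

print(max(ratios))  # staying below 1 suggests (but does not prove) contractiveness
```

When the map is contractive, iterating it for a fixed input converges to a unique activation profile, which is what ties the receptive-field organization to the most recent symbols, i.e. to a Markovian structure.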
Mathematical Aspects of Neural Networks
 European Symposium of Artificial Neural Networks 2003
, 2003
Abstract

Cited by 6 (4 self)
In this tutorial paper about mathematical aspects of neural networks, we focus on two directions: on the one hand, we motivate standard mathematical questions and the well-studied theory of classical neural models used in machine learning; on the other hand, we collect some recent theoretical results (as of early 2003) in the respective areas. In doing so, we follow the dichotomy offered by the overall network structure and restrict ourselves to feedforward networks, recurrent networks, and self-organizing neural systems, respectively.
Topographic Organization of Receptive Fields in Recursive Self-Organizing Map
 In Advances in Natural Computation (pp. 676–685). Lecture Notes in Computer Science
, 2005
Abstract

Cited by 6 (2 self)
Recently, there has been an outburst of interest in extending topographic maps of vectorial data to more general data structures, such as sequences or trees. The representational capabilities and internal representations of the models are not well understood. We concentrate on a generalization of the Self-Organizing Map (SOM) for processing sequential data – the Recursive SOM (RecSOM [1]). We argue that the contractive fixed-input dynamics of RecSOM is likely to lead to Markovian organizations of receptive fields on the map. We show that Markovian topographic maps of sequential data can be produced using a simple fixed (non-adaptable) dynamic module externally feeding a standard topographic model designed to process static vectorial data of fixed dimensionality (e.g. SOM). We elaborate upon the importance of non-Markovian organizations in topographic maps of sequential data.
Self-Organizing Maps for Time Series
, 2005
Abstract

Cited by 5 (0 self)
We review a recent extension of the self-organizing map (SOM) for temporal structures with a simple recurrent dynamics leading to sparse representations, which allows efficient training and a combination with arbitrary lattice structures. We discuss its practical applicability and its theoretical properties. Afterwards, we put the approach into a general framework of recurrent unsupervised models. This generic formulation also covers a variety of well-known alternative approaches, including the temporal Kohonen map, the recursive SOM, and the SOM for structured data. Based on this formulation, mathematical properties of the models are investigated. Interestingly, the dynamics can be generalized from sequences to more general tree structures, thus opening the way to unsupervised processing of general data structures.
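The general framework alluded to here amounts to one recursive distance — a pattern term plus a context term — where different models differ only in how the context is computed from the previous step. A caricature under simplified context rules (the rule definitions and names are ours, not the paper's exact formulations):

```python
import numpy as np

rng = np.random.default_rng(3)

n, dim = 8, 2
W = rng.normal(size=(n, dim))   # pattern weights, shared by all variants
Cw = rng.normal(size=(n, dim))  # context weights, compared against the current context

def winner(x, context, alpha=0.6):
    """Generic recursive distance: pattern term plus context term."""
    d = (alpha * np.sum((W - x) ** 2, axis=1)
         + (1 - alpha) * np.sum((Cw - context) ** 2, axis=1))
    return int(np.argmin(d))

# Different models instantiate the context differently (heavily simplified):
def context_tkm(x, i_prev, ctx):   # TKM-like: leaky integration of the inputs
    return 0.5 * ctx + 0.5 * x

def context_msom(x, i_prev, ctx):  # MSOM-like: merge of previous winner's vectors
    return 0.5 * W[i_prev] + 0.5 * Cw[i_prev]

def run(context_rule, seq):
    """Map a sequence to its winner indices under a given context rule."""
    ctx, out = np.zeros(dim), []
    i = 0
    for x in seq:
        i = winner(x, ctx)
        out.append(i)
        ctx = context_rule(x, i, ctx)
    return out

seq = rng.normal(size=(5, dim))
print(run(context_tkm, seq), run(context_msom, seq))
```

Swapping in a rule that returns the previous winner's lattice position would recover an SOMSD-like variant, which is the sense in which the formulation unifies the family.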
Visualisation of tree-structured data through generative probabilistic modelling, in this volume
Abstract

Cited by 4 (1 self)
We present a generative probabilistic model for the topographic mapping of tree-structured data. The model is formulated as a constrained mixture of hidden Markov tree models. A natural measure of likelihood arises as a cost function that guides the model fitting. We compare our approach with an existing neural-based methodology for constructing topographic maps of directed acyclic graphs. We argue that the probabilistic nature of our model brings several advantages, such as a principled interpretation of the visualisation plots.