Results 1–10 of 18
Dimensions of Neural-Symbolic Integration – A Structured Survey
We Will Show Them: Essays in Honour of Dov Gabbay, 2005
Abstract
Cited by 21 (6 self)
Introduction Research on integrated neural-symbolic systems has made significant progress in the recent past. In particular, the understanding of ways to deal with symbolic knowledge within connectionist systems (also called artificial neural networks) has reached a critical mass which enables the community to strive for applicable implementations and use cases. Recent work has covered a great variety of logics used in artificial intelligence and provides a multitude of techniques for dealing with them within the context of artificial neural networks. Already in the pioneering days of computational models of neural cognition, the question was raised of how symbolic knowledge can be represented and dealt with within neural networks. The landmark paper [McCulloch and Pitts, 1943] provides fundamental insights into how propositional logic can be processed using simple artificial neural networks. Within the following decades, however, the topic did not receive much attention as research in arti ...
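The McCulloch–Pitts construction mentioned in this abstract can be illustrated with a minimal sketch: a threshold unit fires iff its weighted input sum reaches a threshold, and propositional connectives are realised by choosing weights and thresholds. The function names and the particular weight/threshold values below are illustrative assumptions, not taken from the survey.

```python
# A McCulloch-Pitts threshold unit: fires (returns 1) iff the weighted
# input sum reaches the threshold. Weights and thresholds below are
# illustrative choices realising the basic propositional connectives.

def mp_unit(inputs, weights, threshold):
    """Binary threshold neuron over binary inputs."""
    return 1 if sum(w * x for w, x in zip(weights, inputs)) >= threshold else 0

def AND(a, b):
    return mp_unit([a, b], [1, 1], threshold=2)

def OR(a, b):
    return mp_unit([a, b], [1, 1], threshold=1)

def NOT(a):
    # inhibitory weight: the unit fires unless its input is active
    return mp_unit([a], [-1], threshold=0)

# Connectives compose into arbitrary propositional formulas, e.g. XOR:
def XOR(a, b):
    return AND(OR(a, b), NOT(AND(a, b)))
```

Composing such units in layers is exactly the sense in which simple networks process propositional logic.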
Neural Gas for Sequences
Proceedings of the Workshop on Self-Organizing Networks (WSOM), pages 53–58, Kyushu Institute of Technology, 2003
Abstract
Cited by 15 (3 self)
For unsupervised sequence processing, standard self-organizing maps can be naturally extended by using recurrent connections and explicit context representations. Models thereof are the temporal Kohonen map (TKM), the recursive SOM, the SOM for structured data (SOMSD), and the HSOM for sequences (HSOMS). Here, we discuss and compare the capabilities of exemplary approaches to store different types of sequences. We propose a new efficient model, the merge SOM (MSOM), which combines ideas of the TKM and SOMSD and which is particularly suited for processing sequences with dynamic multimodal densities.
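The merging idea described here can be sketched as follows: each neuron stores a weight vector and a context vector, and the current context descriptor merges the previous winner's weight and context. This is a minimal sketch under assumed parameter names and values (`alpha`, `beta`, `lr`), with neighborhood cooperation omitted; it is not the authors' reference implementation.

```python
import numpy as np

# Sketch of an MSOM-style recurrence: each neuron i holds a weight W[i]
# and a context C[i]; the context descriptor for step t merges the
# previous winner's weight and context vectors.
rng = np.random.default_rng(0)
n_neurons, dim = 10, 2
W = rng.normal(size=(n_neurons, dim))   # weight vectors, matched to inputs
C = np.zeros((n_neurons, dim))          # context vectors, matched to contexts
alpha, beta, lr = 0.5, 0.5, 0.1        # illustrative parameter values

def msom_step(x, prev_winner):
    # merged context descriptor from the previous winner
    ctx = (1 - beta) * W[prev_winner] + beta * C[prev_winner]
    # distance mixes the input match and the context match
    d = (1 - alpha) * np.sum((W - x) ** 2, axis=1) \
        + alpha * np.sum((C - ctx) ** 2, axis=1)
    i = int(np.argmin(d))
    # Hebbian update of the winner toward input and context descriptor
    W[i] += lr * (x - W[i])
    C[i] += lr * (ctx - C[i])
    return i

winner = 0
for x in rng.normal(size=(50, dim)):   # toy sequence of 2-d items
    winner = msom_step(x, winner)
```

The key point of the merge is that the context comparison needs only a `dim`-dimensional vector, unlike models that store the whole map activation.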
Unsupervised Recursive Sequence Processing
Neurocomputing, 2003
Abstract
Cited by 14 (6 self)
We propose a self-organizing map (SOM) for sequences by extending the standard SOM by two features: the recursive update of Sperduti [7] and the hyperbolic neighborhood of Ritter [5]. While the former integrates the currently presented item and recent map activations, the latter allows representation of temporally, possibly exponentially, growing sequence diversification. Discrete and real-valued sequences can be processed efficiently with this method, as demonstrated in three experiments.
Neural Methods for Non-Standard Data
Proceedings of the 12th European Symposium on Artificial Neural Networks (ESANN 2004), d-side pub, 2004
Abstract
Cited by 6 (3 self)
Standard pattern recognition provides effective and noise-tolerant tools for machine learning tasks; however, most approaches only deal with real vectors of a finite and fixed dimensionality. In this tutorial paper, we give an overview of extensions of pattern recognition towards non-standard data which are not contained in a finite-dimensional space, such as strings, sequences, trees, graphs, or functions. Two major directions can be distinguished in the neural networks literature: models can be based on a similarity measure adapted to non-standard data, including kernel methods for structures as a very prominent approach, but also alternative metric-based algorithms and functional networks; alternatively, non-standard data can be processed recursively within supervised and unsupervised recurrent and recursive networks and fully recurrent systems.
Mathematical Aspects of Neural Networks
European Symposium on Artificial Neural Networks, 2003
Abstract
Cited by 6 (4 self)
In this tutorial paper about mathematical aspects of neural networks, we focus on two directions: on the one hand, we motivate standard mathematical questions and the well-studied theory of classical neural models used in machine learning; on the other hand, we collect some recent theoretical results (as of the beginning of 2003) in the respective areas. Thereby, we follow the dichotomy offered by the overall network structure and restrict ourselves to feedforward networks, recurrent networks, and self-organizing neural systems, respectively.
Dynamics and topographic organization in recursive self-organizing map
Neural Computation, 2006
Abstract
Cited by 4 (1 self)
Recently, there has been an outburst of interest in extending topographic maps of vectorial data to more general data structures, such as sequences or trees. However, at present, there is no general consensus as to how best to process sequences using topographic maps, and this topic remains a very active focus of current neurocomputational research. The representational capabilities and internal representations of the models are not well understood. We rigorously analyze a generalization of the Self-Organizing Map (SOM) for processing sequential data, the Recursive SOM (RecSOM) (Voegtlin, 2002), as a non-autonomous dynamical system consisting of a set of fixed input maps. We argue that contractive fixed input maps are likely to produce Markovian organizations of receptive fields on the RecSOM map. We derive bounds on the parameter β (weighting the importance of importing past information when processing sequences) under which contractiveness of the fixed input maps is guaranteed. Some generalizations of SOM contain a dynamic module responsible for processing temporal contexts as an integral part of the model. We show that Markovian topographic maps of sequential data can be produced using a simple fixed (non-adaptable) dynamic module externally feeding a standard topographic model designed to process static vectorial data of fixed dimensionality (e.g. SOM). However, by allowing trainable feedback connections one can obtain Markovian maps with superior memory depth and topography preservation. We elaborate upon the importance of non-Markovian organizations in topographic maps of sequential data.
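The fixed-input dynamics analysed in this abstract can be sketched directly: each unit holds an input weight and a context vector over the whole map's previous activation profile, and β weights the context term. Map size and all parameter values below are illustrative assumptions; this is a sketch of the recurrence, not the paper's experimental setup.

```python
import numpy as np

# Sketch of RecSOM-style fixed-input dynamics: iterating y(t) = F_x(y(t-1))
# for a single held-fixed input x. Unit i has an input weight W[i] and a
# context weight vector Cx[i] over the map's previous activations.
rng = np.random.default_rng(1)
n_units, dim = 8, 2
W = rng.normal(size=(n_units, dim))        # input weights
Cx = rng.normal(size=(n_units, n_units))   # context weights, one row per unit
alpha, beta = 1.0, 0.5                     # beta weights the past (illustrative)

def recsom_activation(x, y_prev):
    """One step of the non-autonomous dynamics for input x."""
    d = alpha * np.sum((W - x) ** 2, axis=1) \
        + beta * np.sum((Cx - y_prev) ** 2, axis=1)
    return np.exp(-d)   # activation profile of the whole map

# Iterate a *fixed* input map, as in the contractiveness analysis: small
# enough beta guarantees contraction and hence Markovian receptive fields.
y = np.zeros(n_units)
x = np.array([0.0, 0.0])
for _ in range(20):
    y = recsom_activation(x, y)
```

When each such fixed-input map is a contraction, iterates converge to a unique fixed point, which is the mechanism behind the Markovian organization the paper discusses.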
Topographic Organization of Receptive Fields in Recursive Self-Organizing Map
In Advances in Natural Computation (pp. 676–685), Lecture Notes in Computer Science, 2005
Abstract
Cited by 4 (1 self)
Recently, there has been an outburst of interest in extending topographic maps of vectorial data to more general data structures, such as sequences or trees. The representational capabilities and internal representations of the models are not well understood. We concentrate on a generalization of the Self-Organizing Map (SOM) for processing sequential data – the Recursive SOM (RecSOM [1]). We argue that the contractive fixed-input dynamics of RecSOM is likely to lead to Markovian organizations of receptive fields on the map. We show that Markovian topographic maps of sequential data can be produced using a simple fixed (non-adaptable) dynamic module externally feeding a standard topographic model designed to process static vectorial data of fixed dimensionality (e.g. SOM). We elaborate upon the importance of non-Markovian organizations in topographic maps of sequential data.
Visualisation of tree-structured data through generative probabilistic modelling, in this volume
Abstract
Cited by 4 (1 self)
We present a generative probabilistic model for the topographic mapping of tree-structured data. The model is formulated as a constrained mixture of hidden Markov tree models. A natural measure of likelihood arises as a cost function that guides the model fitting. We compare our approach with an existing neural-based methodology for constructing topographic maps of directed acyclic graphs. We argue that the probabilistic nature of our model brings several advantages, such as a principled interpretation of the visualisation plots.
A General Framework for Self-Organizing Structure Processing Neural Networks, 2003
Abstract
Cited by 4 (4 self)
Self-organization constitutes an important paradigm in machine learning with successful applications, e.g., in data and web mining. However, so far most approaches have been proposed for data contained in a fixed and finite-dimensional vector space. We focus on extensions for more general data structures like sequences and tree structures in this article. Various extensions of the standard self-organizing map (SOM) to sequences or tree structures have been proposed in the literature: the temporal Kohonen map, the recursive SOM, and the SOM for structured data (SOMSD), for example. These methods enhance the standard SOM by recursive connections. We define in this article a general recursive dynamic which enables the recursive processing of complex data structures based on recursively computed internal representations of the respective context. The above mechanisms of SOMs for structures are special cases of the proposed general dynamic; furthermore, the dynamic covers the supervised case of recurrent and recursive networks, too. The general framework offers a uniform notation for training mechanisms such as Hebbian learning and the transfer of alternatives such as vector quantization or the neural gas algorithm to structure-processing networks. The formal definition of the recursive dynamic for structure-processing unsupervised networks allows the transfer of theoretical issues from the SOM literature to the structure-processing case. One can formulate general cost functions corresponding to vector quantization, neural gas, and a modification of SOM for the case of structures. The cost functions can be compared to Hebbian learning, which can be interpreted as an approximation of a stochastic gradient descent. We derive as an alternative the exact gradient ...
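The general recursive dynamic described in this abstract amounts to a distance of the form a·d(x_t, w_i) + b·d_rep(r_{t-1}, c_i), where r_{t-1} is an internal representation of the previous time step and different choices of r recover TKM, RecSOM, or SOMSD. The sketch below instantiates one illustrative choice, a SOMSD-style representation (the index of the previous winner); sizes, parameter names, and the weighting a, b are all assumptions for illustration.

```python
import numpy as np

# Sketch of the general recursive dynamic with a SOMSD-style context:
# the internal representation r(t-1) is the previous winner's index, and
# each neuron's context C[i] is compared against it.
rng = np.random.default_rng(2)
n, dim = 12, 3
W = rng.normal(size=(n, dim))   # weights compared to the current input
C = rng.normal(size=(n, 1))     # contexts compared to r(t-1)
a, b = 0.7, 0.3                 # illustrative weighting of input vs. context

def step(x, r_prev):
    # generic recursive distance: input term plus context term
    d = a * np.sum((W - x) ** 2, axis=1) + b * (C[:, 0] - r_prev) ** 2
    winner = int(np.argmin(d))
    # SOMSD-style internal representation: the winner's index
    return winner, float(winner)

r = 0.0
for x in rng.normal(size=(30, dim)):  # toy sequence
    winner, r = step(x, r)
```

Swapping the representation function (full activation profile for RecSOM, a merged weight/context vector for MSOM) changes only how `r` is computed and compared, which is precisely the uniformity the framework offers.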