Results 1 - 6 of 6
A General Framework for Adaptive Processing of Data Structures
 IEEE Transactions on Neural Networks
, 1998
Abstract

Cited by 117 (46 self)
A structured organization of information is typically required by symbolic processing. On the other hand, most connectionist models assume that data are organized according to relatively poor structures, like arrays or sequences. The framework described in this paper is an attempt to unify adaptive models like artificial neural nets and belief nets for the problem of processing structured information. In particular, relations between data variables are expressed by directed acyclic graphs, where both numerical and categorical values coexist. The general framework proposed in this paper can be regarded as an extension of both recurrent neural networks and hidden Markov models to the case of acyclic graphs. In particular, we study the supervised learning problem as the problem of learning transductions from an input structured space to an output structured space, where transductions are assumed to admit a recursive hidden state-space representation. We introduce a graphical formalism for r...
A self-organizing map for adaptive processing of structured data
 IEEE Transactions on Neural Networks
, 2003
Abstract

Cited by 37 (13 self)
Abstract—Recent developments in the area of neural networks have produced models capable of dealing with structured data. Here, we propose the first fully unsupervised model, namely an extension of traditional self-organizing maps (SOMs), for the processing of labeled directed acyclic graphs (DAGs). The extension is obtained by using the unfolding procedure adopted in recurrent and recursive neural networks, with the replicated neurons in the unfolded network comprising a full SOM. This approach enables the discovery of similarities among objects, including vectors consisting of numerical data. The capabilities of the model are analyzed in detail by utilizing a relatively large data set taken from an artificial benchmark problem involving visual patterns encoded as labeled DAGs. The experimental results demonstrate clearly that the proposed model is capable of exploiting both the information conveyed in the labels attached to each node of the input DAGs and the information encoded in the DAG topology. Index Terms—Clustering, data mining which involves novel types of data/knowledge, data reduction techniques, discovering similarities, innovative algorithms, processing labeled graphs, recurrent neural networks, recursive neural networks, self-organizing maps (SOMs), vector quantization (VQ).
Analysis of the internal representations developed by neural networks for structures applied to quantitative structure-activity relationship studies of benzodiazepines
 J. Chem. Inf. Comput. Sci.
, 2001
Abstract

Cited by 9 (3 self)
An application of recursive cascade correlation (CC) neural networks to quantitative structure-activity relationship (QSAR) studies is presented, with emphasis on the study of the internal representations developed by the neural networks. Recursive CC is a neural network model recently proposed for the processing of structured data. It allows the direct handling of chemical compounds as labeled ordered directed graphs, and constitutes a novel approach to QSAR. The adopted representation of molecular structure captures, in a quite general and flexible way, significant topological aspects and chemical functionalities for each specific class of molecules showing a particular chemical reactivity or biological activity. A class of 1,4-benzodiazepin-2-ones is analyzed by the proposed approach, which compares favorably with the traditional QSAR treatment based on equations. To show the ability of the model to capture most of the structural features that account for the biological activity, the internal representations developed by the networks are analyzed by principal component analysis. This analysis shows that the networks are able to discover relevant structural features solely on the basis of the association between the molecular morphology and the target property (affinity).
Contextual processing of structured data by recursive cascade correlation
 IEEE Transactions on Neural Networks
, 2003
Abstract

Cited by 6 (3 self)
Abstract — We propose a first approach to dealing with contextual information in structured domains by recursive neural networks. The proposed model, i.e., Contextual Recursive Cascade Correlation (CRCC), a generalization of the Recursive Cascade Correlation (RCC) model, is able to partially remove the causality assumption by exploiting contextual information stored in frozen units. We formally characterize the properties of CRCC, showing that it is able to compute contextual transductions and also some causal supersource transductions that RCC cannot compute. Experimental results on controlled sequences and on a real-world task involving chemical structures confirm the computational limitations of RCC, while assessing the efficiency and efficacy of CRCC in dealing with both purely causal and contextual prediction tasks. Moreover, results obtained for the real-world task show the superiority of the proposed approach over RCC when exploring a task for which it is not known whether the structural causality assumption holds. Index Terms — Contextual mapping, Cascade-Correlation, recurrent and recursive neural networks, neural networks for structured data, computational power, learning in structured domains.
Mathematical Aspects of Neural Networks
 European Symposium on Artificial Neural Networks 2003
, 2003
Abstract

Cited by 6 (4 self)
In this tutorial paper about mathematical aspects of neural networks, we will focus on two directions: on the one hand, we will motivate standard mathematical questions and the well-studied theory of classical neural models used in machine learning. On the other hand, we collect some recent theoretical results (as of the beginning of 2003) in the respective areas. Thereby, we follow the dichotomy offered by the overall network structure and restrict ourselves to feedforward networks, recurrent networks, and self-organizing neural systems, respectively.
Universal Approximation Capability of Cascade Correlation for Structures
, 2004
Abstract

Cited by 2 (1 self)
Cascade correlation (CC) constitutes a training method for neural networks that determines the weights as well as the neural architecture during training. Various extensions of CC to structured data have been proposed: recurrent cascade correlation (RCC) for sequences, recursive cascade correlation (RecCC) for tree structures with limited ...