Results 1 - 7 of 7
Nonlinear Markov Networks for Continuous Variables
, 1998
Abstract
Cited by 26 (4 self)
In this paper we address the problem of learning the structure of nonlinear Markov networks with continuous variables. Markov networks are well suited to modelling relationships that do not exhibit a natural causal ordering. We use neural network structures to model the quantitative relationships between variables. Using two data sets, we show that interesting structures can be found with our approach.

1 Introduction
Knowledge about independence or conditional independence between variables is most helpful in "understanding" a domain. An intuitive representation of independencies is achieved by graphical stochastic models, in which independence statements can be extracted from the structure of the graph. The two most popular types of graphical stochastic models are Bayesian networks, which use a directed graph, and Markov networks, which use an undirected graph. Whereas Bayesian networks are well suited to representing causal relationships, Markov networks are mostly used in cases where the...
Learning hybrid Bayesian networks from data
, 1998
Abstract
Cited by 11 (1 self)
We illustrate two different methodologies for learning Hybrid Bayesian networks, that is, Bayesian networks containing both continuous and discrete variables, from data. The two methodologies differ in the way of handling continuous data when learning the Bayesian network structure. The first methodology uses discretized data to learn the Bayesian network structure, and the original nondiscretized data for the parameterization of the learned structure. The second methodology uses nondiscretized data both to learn the Bayesian network structure and its parameterization. For the direct handling of continuous data, we propose the use of artificial neural networks as probability estimators, to be used as an integral part of the scoring metric defined to search the space of Bayesian network structures. With both methodologies, we assume the availability of a complete dataset, with no missing values or hidden variables. We report experimental results aimed at comparing the two methodologies. These results provide evidence that learning with discretized data presents advantages both in terms of efficiency and in terms of accuracy of the learned models over the alternative approach of using nondiscretized data.
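The first methodology's two-stage split (discretized data for structure learning, original continuous data for parameterization) can be sketched as follows. The equal-frequency binning, the bin count, and the toy data are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def equal_frequency_discretize(x, n_bins=3):
    """Map a continuous column to integer bin labels using quantile cut points,
    so each bin holds roughly the same number of samples."""
    cut_points = np.quantile(x, np.linspace(0, 1, n_bins + 1)[1:-1])
    return np.digitize(x, cut_points)  # labels in 0 .. n_bins-1

rng = np.random.default_rng(0)
x = rng.normal(size=1000)                      # one continuous variable
labels = equal_frequency_discretize(x, n_bins=3)

# Stage 1 (structure search) would score candidate structures against
# discrete `labels`; stage 2 (parameterization) would return to the
# original continuous `x` for the structure found in stage 1.
counts = np.bincount(labels)
```

The design choice this illustrates is that discretization is used only where it helps (making the structure-search scoring cheap and well-defined) while the final model parameters are still fit on the information-richer continuous data.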
Coevolutionary rule-chaining genetic programming
 In Intelligent Data Engineering and Automated Learning - IDEAL 2005: 6th International Conference. Lecture Notes in Computer Science
, 2005
Abstract
Cited by 1 (1 self)
Abstract. A novel Genetic Programming (GP) paradigm called Coevolutionary Rule-Chaining Genetic Programming (CRGP) has been proposed to learn the relationships among attributes, represented by a set of classification rules, for multiclass problems. It employs backward chaining inference to carry out classification based on the acquired acyclic rule set. Its main advantages are: 1) it can handle more than one class at a time; 2) it avoids cyclic results; 3) unlike Bayesian networks (BN), CRGP can handle input attributes with continuous values directly; and 4) with the flexibility of GP, CRGP can learn complex relationships. We have demonstrated its better performance on one synthetic and one real-life medical data set.
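The backward chaining inference that CRGP applies to its acyclic rule set can be sketched roughly as below. The rule representation (a conclusion mapped to lists of premises) and the medical-flavored example rules are illustrative assumptions, not CRGP's actual encoding.

```python
# Minimal backward-chaining sketch over an acyclic rule set.
# Each conclusion maps to a list of alternative premise sets (OR of ANDs).
RULES = {
    "high_risk": [["smoker", "hypertensive"]],  # high_risk if smoker AND hypertensive
    "hypertensive": [["bp_over_140"]],          # hypertensive if bp_over_140
}

def prove(goal, facts, rules):
    """Return True if `goal` follows from the observed `facts` via the rules."""
    if goal in facts:                        # base case: goal is directly observed
        return True
    for premises in rules.get(goal, []):     # try each rule concluding `goal`
        if all(prove(p, facts, rules) for p in premises):
            return True
    return False

result = prove("high_risk", {"smoker", "bp_over_140"}, RULES)  # True
```

Because the rule set is acyclic, the recursion is guaranteed to terminate, which is exactly why the papers stress avoiding cyclic rules.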
Learning acyclic rules based on Chaining Genetic Programming
Abstract
A multiclass problem is the class of problems having more than one class in the data set. Bayesian networks (BN) are a well-known algorithm for handling multiclass problems and are applied in different areas, but BN cannot handle continuous values. In contrast, Genetic Programming (GP) can handle continuous values and produces classification rules. However, GP may produce cyclic rules representing tautologies, which are useless for inference and expert systems. Coevolutionary Rule-chaining Genetic Programming (CRGP) is the first variant of GP that handles the multiclass problem and produces acyclic classification rules [16]. It employs backward chaining inference to carry out classification based on the acquired acyclic rule set. It can handle multiple classes; it can avoid cyclic rules; it can handle input attributes with continuous values; and it can learn complex relationships among the attributes. In this paper, we propose a novel algorithm, Chaining Genetic Programming (CGP), which learns a set of acyclic rules and produces better results than CRGP. The experimental results demonstrate that the proposed algorithm has a shorter learning process and can produce more accurate acyclic classification rules.
Other Professional Positions and Major Visiting Appointments: Year Position/Title Institution
A Comparison of Structural Distance Measures for Causal Bayesian Network Models
 In Recent Advances in Intelligent Information Systems, ISBN 9788360434598, pages 443–456
Abstract
We compare measures of structural distance both between Bayesian networks and between equivalence classes of Bayesian networks. The main application of these measures is in learning algorithms, where typically the interest is in how accurately a gold-standard structure is retrieved by a learning algorithm. Structural distance measures can be especially useful when looking for causal structures. We discuss desirable properties of measures, review existing measures, and show some of our empirical findings concerning the performance of these metrics in practice.
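One widely used measure of the kind this abstract surveys is the structural Hamming distance (SHD) between two directed graph structures: the number of edge additions, deletions, and reversals needed to turn the learned graph into the gold standard. The edge-set representation and the example graphs below are illustrative assumptions, not the paper's specific formulation.

```python
def shd(edges_a, edges_b):
    """Structural Hamming distance between two directed graphs, each given
    as a set of (parent, child) edges: edges present in only one skeleton
    count as additions/deletions, shared edges with opposite orientation
    count as reversals."""
    def skeleton(edges):
        return {frozenset(e) for e in edges}

    sk_a, sk_b = skeleton(edges_a), skeleton(edges_b)
    add_del = len(sk_a ^ sk_b)          # edges missing from one skeleton
    reversals = sum(                     # shared skeleton edge, flipped direction
        1 for e in edges_a if e not in edges_b and frozenset(e) in sk_b
    )
    return add_del + reversals

gold = {("A", "B"), ("B", "C")}
learned = {("B", "A"), ("B", "C"), ("C", "D")}
d = shd(gold, learned)  # one reversal (A-B) plus one extra edge (C-D): 2
```

Note that SHD treats a reversed edge the same as one wrong edge, which is one of the properties such comparisons of measures typically examine, since a reversal can matter far more in a causal reading of the graph.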
ALGORITHMS FOR CONSTRAINT-BASED LEARNING OF BAYESIAN NETWORK STRUCTURES WITH LARGE NUMBERS OF VARIABLES
, 2014
"... This dissertation was presented by Martijn de Jongh. It was defended on ..."