Learning Bayesian networks: The combination of knowledge and statistical data
Machine Learning, 1995
Cited by 905 (34 self)

Abstract:
We describe scoring metrics for learning Bayesian networks from a combination of user knowledge and statistical data. We identify two important properties of metrics, which we call event equivalence and parameter modularity. These properties have been mostly ignored, but when combined, greatly simplify the encoding of a user’s prior knowledge. In particular, a user can express his knowledge—for the most part—as a single prior Bayesian network for the domain.
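The properties the abstract names make a network score decompose into independent per-family terms, so structure search only has to rescore the families an edge change touches. The paper's own BDe metric is not reproduced here; as an illustrative stand-in, the sketch below uses the decomposable BIC score, which shares this modular per-family structure. The function names and toy data are hypothetical.

```python
import math
from collections import Counter

def family_score(data, child, parents, arities):
    """BIC score contribution of one node given its parent set.

    data: list of dicts mapping variable name -> discrete value
    arities: dict mapping variable name -> number of values
    """
    counts = Counter()          # (parent config, child value) -> count
    parent_counts = Counter()   # parent config -> count
    for row in data:
        cfg = tuple(row[p] for p in parents)
        counts[(cfg, row[child])] += 1
        parent_counts[cfg] += 1
    # Maximized log-likelihood of this family.
    loglik = sum(n * math.log(n / parent_counts[cfg])
                 for (cfg, _), n in counts.items())
    # Number of free parameters: one distribution per parent configuration.
    q = 1
    for p in parents:
        q *= arities[p]
    k = q * (arities[child] - 1)
    return loglik - 0.5 * k * math.log(len(data))

def network_score(data, structure, arities):
    # Decomposability: the total score is a sum of per-family terms,
    # so a local edge change only requires rescoring the affected family.
    return sum(family_score(data, child, parents, arities)
               for child, parents in structure.items())

# Toy example: two binary variables where B loosely copies A.
data = [{"A": 0, "B": 0}, {"A": 0, "B": 0},
        {"A": 1, "B": 1}, {"A": 1, "B": 1},
        {"A": 1, "B": 0}, {"A": 0, "B": 1}]
arities = {"A": 2, "B": 2}
s_edge = network_score(data, {"A": (), "B": ("A",)}, arities)
s_empty = network_score(data, {"A": (), "B": ()}, arities)
```

Comparing `s_edge` and `s_empty` is exactly the kind of local decision a greedy structure-search algorithm makes at each step.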
An Algorithmic Framework For Density Estimation Based Evolutionary Algorithms
, 1999
Cited by 24 (5 self)

Abstract:
The direct application of statistics to stochastic optimization in evolutionary computation has become more important and prevalent over the last few years. With the introduction of the notion of the Estimation of Distribution Algorithm (EDA), a new line of research has been named. So far, the application area has mostly been the same as for classic genetic algorithms: binary vector encoded problems. The most important aspect of the new algorithms is the step in which probability densities are estimated. In probability theory, a distinction is made between discrete and continuous distributions and methods. Using the rationale for density estimation based evolutionary algorithms, we present an algorithmic framework for them, named IDEA. This allows us to define such algorithms for vectors of both continuous and discrete random variables, combining techniques from existing EDAs as well as density estimation theory. The emphasis is on techniques for vectors of continuous random variables, for which we present new algorithms in the field of density estimation based evolutionary algorithms, using two different density estimation models.
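The estimate-then-sample loop the abstract describes can be sketched concretely. The following is a minimal continuous EDA with a factorized (univariate) Gaussian density per coordinate, not the paper's full IDEA framework; the function names, parameters, and the sphere test objective are illustrative assumptions.

```python
import random
import statistics

def gaussian_eda(objective, dim, pop_size=100, elite_frac=0.3,
                 generations=50, init_range=(-5.0, 5.0), seed=0):
    """Minimal continuous EDA sketch with an independent Gaussian per coordinate.

    Each generation: select the best fraction of the population, fit a
    Gaussian density to each coordinate of the selected set, then sample
    a fresh population from the estimated density. Minimizes `objective`.
    """
    rng = random.Random(seed)
    pop = [[rng.uniform(*init_range) for _ in range(dim)]
           for _ in range(pop_size)]
    n_sel = max(2, int(elite_frac * pop_size))
    best = min(pop, key=objective)
    for _ in range(generations):
        selected = sorted(pop, key=objective)[:n_sel]
        if objective(selected[0]) < objective(best):
            best = selected[0]
        # Density estimation step: fit mean and stddev per coordinate.
        mus = [statistics.fmean(x[d] for x in selected) for d in range(dim)]
        sigmas = [statistics.pstdev([x[d] for x in selected]) + 1e-12
                  for d in range(dim)]
        # Sampling step: draw the next population from the fitted model.
        pop = [[rng.gauss(mus[d], sigmas[d]) for d in range(dim)]
               for _ in range(pop_size)]
    return min(pop + [best], key=objective)

# Hypothetical test objective: the sphere function, minimized at the origin.
sphere = lambda x: sum(v * v for v in x)
solution = gaussian_eda(sphere, dim=5)
```

A multivariate or mixture density model in the estimation step would capture dependencies between variables, which is the direction the IDEA framework generalizes.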