## Bayesian Methods for Neural Networks (1999)

Citations: 9 (0 self)

### BibTeX

```bibtex
@TECHREPORT{Freitas99bayesianmethods,
  author      = {João F. G. De Freitas},
  title       = {Bayesian Methods for Neural Networks},
  institution = {},
  year        = {1999}
}
```

### Abstract

The application of the Bayesian learning paradigm to neural networks results in a flexible and powerful nonlinear modelling framework that can be used for regression, density estimation, prediction and classification. Within this framework, all sources of uncertainty are expressed and measured by probabilities. This formulation allows for a probabilistic treatment of our a priori knowledge, domain-specific knowledge, model selection schemes, parameter estimation methods and noise estimation techniques. Many researchers have contributed towards the development of the Bayesian learning approach for neural networks. This thesis advances this research by proposing several novel extensions in the areas of sequential learning, model selection, optimisation and convergence assessment. The first contribution is a regularisation strategy for sequential learning based on extended Kalman filtering and noise estimation via evidence maximisation. Using the expectation maximisation (EM) algorithm, a similar algorithm is derived for batch learning. Much of the thesis is, however, devoted to Monte Carlo simulation methods. A robust Bayesian method is proposed to estimate,
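The abstract's first contribution rests on treating sequential neural-network training as state estimation: the weights are the state of a random-walk dynamical system, and each training pair is a noisy observation of the network output, so an extended Kalman filter (EKF) can update the weights one example at a time. The following is a minimal sketch of that EKF weight-update idea only (it omits the thesis's evidence-maximisation noise estimation); the toy model, function names and noise settings are illustrative assumptions, not taken from the thesis.

```python
import numpy as np

def ekf_step(w, P, x, y, g, jac, Q, R):
    """One sequential EKF update of weight vector w with covariance P.

    g(w, x)   -- network output for input x (illustrative toy model below)
    jac(w, x) -- Jacobian of the output with respect to the weights
    Q, R      -- assumed process (weight drift) and observation noise covariances
    """
    P_pred = P + Q                       # random-walk prediction of the weight covariance
    H = jac(w, x)                        # linearise the network around current weights
    S = H @ P_pred @ H.T + R             # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)  # Kalman gain
    innov = y - g(w, x)                  # prediction error on this training pair
    w_new = w + K @ innov                # weight update
    P_new = P_pred - K @ H @ P_pred      # covariance update
    return w_new, P_new

# Toy "network": a single tanh unit, y = tanh(w0 * x) * w1 (hypothetical example)
def g(w, x):
    return np.array([np.tanh(w[0] * x) * w[1]])

def jac(w, x):
    t = np.tanh(w[0] * x)
    return np.array([[(1.0 - t**2) * x * w[1], t]])

rng = np.random.default_rng(0)
w_true = np.array([1.5, 0.8])        # weights used to generate the data
w = np.array([0.5, 0.5])             # small nonzero start (Jacobian vanishes at zero)
P = np.eye(2)
Q = 1e-4 * np.eye(2)                 # small drift acts as a regulariser
R = 1e-2 * np.eye(1)

for _ in range(500):
    x = rng.uniform(-2.0, 2.0)
    y = g(w_true, x) + 0.1 * rng.standard_normal(1)
    w, P = ekf_step(w, P, x, y, g, jac, Q, R)
```

After a few hundred sequential updates the filtered weights reproduce the data-generating function closely; the process-noise covariance `Q` plays the regularising role the abstract alludes to, since it keeps `P` from collapsing and lets the weights keep adapting.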