Results

**1 - 4** of **4**

### Advance Access publication on June 18, 2008, doi:10.1093/comjnl/bxm117

One of the second generation of computer scientists, Chris Wallace completed his tertiary education in 1959 with a Ph.D. in nuclear physics, on cosmic ray showers, under Dr Paul George at Sydney University. Needless to say, computer science was not, at that stage, an established academic discipline. With Max Brennan and John Malos he had designed and built a large automatic data logging system for recording cosmic ray air shower events, and with Max Brennan he also developed a complex computer programme for Bayesian analysis of cosmic ray events on the recently installed SILLIAC computer. Appointed lecturer in Physics at Sydney in 1960, he was sent almost immediately to the University of Illinois to copy the design of ILLIAC II, a duplicate of which was to be built at Sydney. ILLIAC II was not in fact completed at that stage and, after an initial less than warm welcome by a department that seemed unsure exactly what this Australian was doing in their midst, his talents were recognized and he was invited to join their staff (under very generous conditions) to assist in the ILLIAC II design. He remained there for two years, helping in particular to design the input/output channels and aspects of the advanced control unit (first-stage pipeline). In the event, Sydney decided it would be too expensive to build a copy of ILLIAC II, although a successful copy (the Golem) was built in Israel using circuit designs developed by Wallace and Ken Smith. In spite of considerable financial and academic inducements to remain in America, Wallace returned to Australia after three months spent in England familiarizing himself with the KDF9 computer being purchased by Sydney University to replace SILLIAC. Returning to the School of Physics, he joined the Basser

### Build Your Own Probability Monads (DRAFT)

Probability is often counter-intuitive, and it always involves a great deal of math. This is unfortunate, because many applications in robotics and AI increasingly rely on probability theory. We introduce a modular toolkit for constructing probability monads, and show that it can be used for everything from discrete distributions to weighted particle filtering. This modular approach allows us to present a single, easy-to-use API for working with many kinds of probability distributions. Our toolkit combines several existing components (the list monad, the Rand monad, and the MaybeT monad transformer) with a stripped-down version of WriterT Prob, and a new monad for sequential Monte Carlo sampling. Using these components, we show that MaybeT can be used to implement Bayes' theorem. We also show how to implement a monad for weighted particle filtering.
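The paper's toolkit is written in Haskell; purely as an illustration of the core idea (not the paper's code), here is a minimal Python sketch of a probability monad: a finite distribution type with a monadic bind, plus a MaybeT-style `condition` that discards outcomes contradicting the evidence and renormalizes, which for finite distributions is exactly Bayes' theorem. All names here (`Dist`, `bind`, `condition`) are invented for the sketch.

```python
from fractions import Fraction

class Dist:
    """A finite discrete distribution: the probability monad over a dict."""
    def __init__(self, weights):
        total = sum(weights.values())
        self.probs = {x: Fraction(w, total) for x, w in weights.items()}

    @staticmethod
    def uniform(xs):
        return Dist({x: 1 for x in xs})

    def bind(self, f):
        """Monadic bind: run f at every outcome and mix the resulting distributions."""
        out = {}
        for x, p in self.probs.items():
            for y, q in f(x).probs.items():
                out[y] = out.get(y, 0) + p * q
        d = Dist.__new__(Dist)
        d.probs = out
        return d

    def condition(self, pred):
        """The MaybeT-style step: drop outcomes that contradict the evidence,
        then renormalize -- Bayes' theorem for finite distributions."""
        kept = {x: p for x, p in self.probs.items() if pred(x)}
        total = sum(kept.values())
        d = Dist.__new__(Dist)
        d.probs = {x: p / total for x, p in kept.items()}
        return d

# Two fair coins; observe that at least one landed heads.
coins = Dist.uniform("HT").bind(
    lambda a: Dist.uniform("HT").bind(
        lambda b: Dist({(a, b): 1})))
posterior = coins.condition(lambda ab: "H" in ab)
# posterior.probs[("H", "H")] is 1/3, the classic conditional-probability answer
```

Exact `Fraction` arithmetic stands in for the paper's `Prob` weights; swapping the dict for a list of weighted samples would give the particle-filtering variant the abstract describes.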

### Added Distributions for use in Clustering (Mixture Modelling), Function Models, Regression Trees, Segmentation, and mixed Bayesian Networks in Inductive Programming 1.2.

2008

Abstract. Inductive programming is a machine learning paradigm combining functional programming (FP) with the information-theoretic criterion, Minimum Message Length (MML). IP 1.2 now includes the Geometric and Poisson distributions over non-negative integers, and Student's t-distribution over continuous values, as well as the Multinomial and Normal (Gaussian) distributions from before. All of these can be used with IP's model-transformation operators, and structure-learning algorithms including clustering (mixture models), classification (decision) trees and other regressions, and mixed Bayesian networks, provided only that the types match between each corresponding component: Model, transformation, structured model, and variable (discrete, continuous, sequence, multivariate, and so on).
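The MML criterion selects the model whose two-part message, parameter description plus data encoded under that model, is shortest. As a rough illustration only (IP itself is a functional-programming system, and real MML parameter codes such as the Wallace-Freeman approximation are far more careful), here is a Python sketch that compares Geometric and Poisson models of non-negative integer data by an approximate message length; the 0.5·log(n) nats per parameter cost and the sample data are assumptions of the sketch, not the paper's method.

```python
import math

def nll_geometric(data):
    """Negative log-likelihood under Geometric(p) on {0,1,2,...}, p at its MLE."""
    n, s = len(data), sum(data)
    p = n / (n + s)            # MLE: sample mean = (1 - p) / p
    return -(n * math.log(p) + s * math.log(1 - p))

def nll_poisson(data):
    """Negative log-likelihood under Poisson(lam), lam at its MLE (sample mean)."""
    lam = sum(data) / len(data)
    return -sum(k * math.log(lam) - lam - math.lgamma(k + 1) for k in data)

def message_length(nll, n, n_params=1):
    """Crude two-part message length in nats: parameter cost + data cost.
    Real MML codes each parameter to its optimal precision; charging
    0.5*log(n) nats per parameter is a stand-in for that calculation."""
    return n_params * 0.5 * math.log(n) + nll

data = [0, 1, 0, 2, 1, 0, 0, 3, 1, 0]   # hypothetical count data
lengths = {
    "geometric": message_length(nll_geometric(data), len(data)),
    "poisson": message_length(nll_poisson(data), len(data)),
}
best = min(lengths, key=lengths.get)     # MML picks the shortest total message
```

Because both models here cost one parameter, the comparison reduces to fit; with unequal parameter counts the message-length penalty is what lets MML trade model complexity against fit, which is how the structure-learning algorithms above choose between, say, more or fewer mixture components.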