Results 1 - 3 of 3
Learning Programs: A Hierarchical Bayesian Approach
"... We are interested in learning programs for multiple related tasks given only a few training examples per task. Since the program for a single task is underdetermined by its data, we introduce a nonparametric hierarchical Bayesian prior over programs which shares statistical strength across multiple ..."
Abstract

Cited by 9 (1 self)
We are interested in learning programs for multiple related tasks given only a few training examples per task. Since the program for a single task is underdetermined by its data, we introduce a nonparametric hierarchical Bayesian prior over programs which shares statistical strength across multiple tasks. The key challenge is to parametrize this multitask sharing. For this, we introduce a new representation of programs based on combinatory logic and provide an MCMC algorithm that can perform safe program transformations on this representation to reveal shared interprogram substructures.
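The representational move here is concrete enough to sketch: in combinatory logic, every program is built from a few primitive combinators and application alone, so transformations can be checked syntactically. Below is a minimal Python illustration of such a term representation and its reduction rules (my own sketch, not the paper's code; the paper additionally places a hierarchical Bayesian prior over such terms and searches them with MCMC):

```python
# A minimal sketch of a combinatory-logic program representation.
# Terms are primitive combinators or binary applications, and reduction
# preserves meaning, which is what makes transformations on this
# representation "safe" to apply during search.

from dataclasses import dataclass
from typing import Union

@dataclass(frozen=True)
class Comb:
    name: str  # "S", "K", "I", or an inert constant for demos

@dataclass(frozen=True)
class App:
    f: "Term"
    x: "Term"

Term = Union[Comb, App]

S, K, I = Comb("S"), Comb("K"), Comb("I")

def step(t: Term) -> Term:
    """One leftmost reduction step; returns t unchanged at normal form."""
    if isinstance(t, App):
        if t.f == I:                                   # I x -> x
            return t.x
        if isinstance(t.f, App) and t.f.f == K:        # K x y -> x
            return t.f.x
        if (isinstance(t.f, App) and isinstance(t.f.f, App)
                and t.f.f.f == S):                     # S f g x -> f x (g x)
            f, g, x = t.f.f.x, t.f.x, t.x
            return App(App(f, x), App(g, x))
        reduced_f = step(t.f)                          # otherwise reduce inside
        if reduced_f != t.f:
            return App(reduced_f, t.x)
        return App(t.f, step(t.x))
    return t

def normalize(t: Term, fuel: int = 1000) -> Term:
    for _ in range(fuel):
        nxt = step(t)
        if nxt == t:
            return t
        t = nxt
    return t  # give up on non-terminating terms

# S K K behaves like the identity: S K K x -> (K x) (K x) -> x
x = Comb("x")
assert normalize(App(App(App(S, K), K), x)) == x
```

Because terms are pure applicative trees with no variable binding, a shared subterm (say, App(S, K)) can be lifted out and named wherever it recurs, which is the kind of interprogram substructure the paper's transformations are designed to expose.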
Genetic Programming
"... Abstract Welcome to genetic programming, where the forces of nature are used to automatically evolve computer programs. We give a flavour of where GP has been successfully applied (it is far too wide an area to cover everything) and interesting current and future research but start with a tutorial o ..."
Abstract
 Add to MetaCart
Welcome to genetic programming, where the forces of nature are used to automatically evolve computer programs. We give a flavour of where GP has been successfully applied (it is far too wide an area to cover everything) and of interesting current and future research, but we start with a tutorial on how to get started and finish with common pitfalls to avoid.
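For readers who want a concrete starting point, here is a minimal genetic-programming loop (my own toy illustration, not from the tutorial): a population of random arithmetic expression trees is evolved toward a target function by selection and mutation. The target f(x) = x*x + x, the operator set, and all parameter settings are assumptions chosen for the demo.

```python
import random
import operator

OPS = [operator.add, operator.sub, operator.mul]
TERMINALS = ["x", 1.0]

def random_tree(depth=3):
    # Grow a random expression tree; leaves are the variable x or a constant.
    if depth == 0 or random.random() < 0.3:
        return random.choice(TERMINALS)
    return (random.choice(OPS), random_tree(depth - 1), random_tree(depth - 1))

def evaluate(tree, x):
    if tree == "x":
        return x
    if isinstance(tree, float):
        return tree
    fn, left, right = tree
    return fn(evaluate(left, x), evaluate(right, x))

def fitness(tree):
    # Sum of squared errors against the target on sample points (lower is better).
    return sum((evaluate(tree, x) - (x * x + x)) ** 2 for x in range(-5, 6))

def mutate(tree):
    # Replace a random node with a fresh random subtree.
    if not isinstance(tree, tuple) or random.random() < 0.2:
        return random_tree(depth=2)
    fn, left, right = tree
    if random.random() < 0.5:
        return (fn, mutate(left), right)
    return (fn, left, mutate(right))

pop = [random_tree() for _ in range(200)]
for gen in range(30):
    pop.sort(key=fitness)          # selection: rank by error
    if fitness(pop[0]) < 1e-9:     # perfect fit found
        break
    survivors = pop[:100]          # keep the better half
    # Refill with mutated survivors (a full GP would also use subtree crossover).
    pop = survivors + [mutate(random.choice(survivors)) for _ in range(100)]

best = min(pop, key=fitness)
print("best fitness:", fitness(best))
```

Mutation-only evolution keeps the sketch short; real GP systems rely heavily on subtree crossover and on bloat control to keep trees from growing without bound.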
Bootstrap Learning via Modular Concept Discovery
Proceedings of the Twenty-Third International Joint Conference on Artificial Intelligence
"... Suppose a learner is faced with a domain of problems about which it knows nearly nothing. It does not know the distribution of problems, the space of solutions is not smooth, and the reward signal is uninformative, providing perhaps a few bits of information but not enough to steer the learner effec ..."
Abstract
 Add to MetaCart
Suppose a learner is faced with a domain of problems about which it knows nearly nothing. It does not know the distribution of problems, the space of solutions is not smooth, and the reward signal is uninformative, providing perhaps a few bits of information but not enough to steer the learner effectively. How can such a learner ever get off the ground? A common intuition is that if the solutions to these problems share a common structure, and the learner can solve some simple problems by brute force, it should be able to extract useful components from these solutions and, by composing them, explore the solution space more efficiently. Here, we formalize this intuition, where the solution space is that of typed functional programs and the gained information is stored as a stochastic grammar over programs. We propose an iterative procedure for exploring such spaces: in the first step of each iteration, the learner explores a finite subset of the domain, guided by a stochastic grammar; in the second step, the learner compresses the successful solutions from the first step to estimate a new stochastic grammar. We test this procedure on symbolic regression and Boolean circuit learning and show that the learner discovers modular concepts for these domains. Whereas the learner is able to solve almost none of the posed problems in the procedure’s first iteration, it rapidly becomes able to solve a large number by gaining abstract knowledge of the structure of the solution space.
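The two-step explore/compress loop is concrete enough to sketch. Below is a schematic Python rendering (my simplification, not the authors' code): the explore step samples programs from a weighted grammar and keeps those that solve some task; the compress step is reduced here to reweighting productions by how often their tokens appear in solutions, whereas the real procedure refits a full stochastic grammar and can name new reusable subexpressions. The toy grammar, the tasks, and the eval-based checker are all assumptions for the demo.

```python
import random
from collections import Counter

# A toy weighted grammar over arithmetic expressions in one variable x.
# Each rule maps a nonterminal to (weight, expansion) pairs; tokens in an
# expansion that appear as keys are nonterminals, everything else is terminal.
grammar = {
    "E": [(1.0, ["x"]), (1.0, ["1"]),
          (1.0, ["(", "E", "+", "E", ")"]),
          (1.0, ["(", "E", "*", "E", ")"])],
}

def sample(symbol="E", depth=0, max_depth=4):
    rules = grammar.get(symbol)
    if rules is None:
        return symbol                     # terminal token
    if depth >= max_depth:
        rules = rules[:2]                 # force a terminal expansion
    weights = [w for w, _ in rules]
    _, expansion = random.choices(rules, weights=weights)[0]
    return "".join(sample(tok, depth + 1, max_depth) for tok in expansion)

def solves(expr, task):
    """task is an (input, output) pair; a program solves it if it matches."""
    x, want = task
    try:
        return eval(expr, {"x": x}) == want
    except Exception:
        return False

tasks = [(2, 4), (3, 9), (2, 3), (5, 6)]  # toy tasks: x*x and x + 1

for iteration in range(5):
    # Explore: sample a batch of programs; keep the ones that solve a task.
    hits = [e for e in (sample() for _ in range(2000))
            if any(solves(e, t) for t in tasks)]
    # Compress (heavily simplified): upweight productions whose tokens occur
    # often in solutions, a crude surrogate for re-estimating the grammar.
    counts = Counter(tok for e in hits for tok in e)
    grammar["E"] = [(1.0 + counts["x"], ["x"]),
                    (1.0 + counts["1"], ["1"]),
                    (1.0 + counts["+"], ["(", "E", "+", "E", ")"]),
                    (1.0 + counts["*"], ["(", "E", "*", "E", ")"])]
    print(iteration, "solved:", len(hits), "of 2000 samples")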