Results 1 – 5 of 5
Fast Kalman filtering on quasilinear dendritic trees, 2009
Abstract

Cited by 5 (3 self)
Optimal filtering of noisy voltage signals on dendritic trees is a key problem in computational cellular neuroscience. However, the state variable in this problem — the vector of voltages at every compartment — is very high-dimensional: realistic multicompartmental models often have on the order of N = 10^4 compartments. Standard implementations of the Kalman filter require O(N^3) time and O(N^2) space, and are therefore impractical. Here we take advantage of three special features of the dendritic filtering problem to construct an efficient filter: (1) dendritic dynamics are governed by a cable equation on a tree, which may be solved using sparse matrix methods in O(N) time; and current methods for observing dendritic voltage (2) provide low-SNR observations and (3) only image a relatively small number of compartments at a time. The idea is to approximate the Kalman equations in terms of a low-rank perturbation of the steady-state (zero-SNR) solution, which may be obtained in O(N) time using methods that exploit the sparse tree structure of dendritic dynamics. The resulting methods give a very good approximation to the exact Kalman solution, but only require O(N) time and space. We illustrate the method with applications to real and simulated dendritic branching structures, and describe how to extend the techniques to incorporate spatially subsampled, temporally filtered, and nonlinearly transformed observations.
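The key structural point in the abstract — that cable dynamics on a tree give a sparse system solvable in O(N) time — can be illustrated with a minimal sketch. This is a hypothetical toy (an unbranched chain of compartments with made-up conductance parameters, using an implicit backward-Euler step), not the paper's full low-rank Kalman filter; a real dendritic tree yields a similarly sparse, nearly tridiagonal ("Hines") matrix.

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import spsolve

# Toy cable equation on a chain of N compartments, stepped implicitly:
# (I + dt*L) v_new = v_old, where L couples each compartment to its
# neighbors. The matrix is tridiagonal, so the sparse solve is O(N),
# in contrast to the O(N^3) dense solves of a naive Kalman filter.
N = 1000
dt, g_leak, g_axial = 0.1, 0.1, 1.0   # assumed toy parameters

main = np.full(N, 1.0 + dt * (g_leak + 2 * g_axial))
off = np.full(N - 1, -dt * g_axial)
A = diags([off, main, off], [-1, 0, 1], format="csc")

rng = np.random.default_rng(0)
v = rng.standard_normal(N)        # current voltage estimate
v_pred = spsolve(A, v)            # one prediction step, O(N) work
```

The same sparse solve is the building block the paper reuses inside its low-rank perturbation of the steady-state Kalman solution.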
Semirational Models of Conditioning: The Case of Trial Order, 2007
Abstract

Cited by 4 (2 self)
Bayesian treatments of animal conditioning start from a generative model that specifies precisely a set of assumptions about the structure of the learning task. Optimal rules for learning are direct mathematical consequences of these assumptions. In terms of Marr’s (1982) levels of analyses, the main task at the computational level
Speeding up magnetic resonance image acquisition by Bayesian multi-slice adaptive compressed sensing. Supplemental Appendix, 2010
Abstract

Cited by 3 (3 self)
We show how to sequentially optimize magnetic resonance imaging measurement designs over stacks of neighbouring image slices, by performing convex variational inference on a large-scale non-Gaussian linear dynamical system, tracking dominating directions of posterior covariance without imposing any factorization constraints. Our approach can be scaled up to high-resolution images by reductions to numerical mathematics primitives and parallelization on several levels. In a first study, designs are found that improve significantly on others chosen independently for each slice or drawn at random.
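The sequential-design idea in this abstract — pick the next measurement where posterior uncertainty is largest, then update the posterior — can be sketched in a much simpler setting. The toy below assumes a plain Gaussian linear model with unit observation noise and random candidate measurement vectors; the paper's actual non-Gaussian, variational machinery is far more elaborate, and all names and parameters here are illustrative only.

```python
import numpy as np

# Sequential Bayesian design for a Gaussian linear model y = x^T theta + noise:
# at each step, choose the candidate x with the largest posterior predictive
# variance x^T Sigma x, then apply the rank-one (Sherman-Morrison) update
# to the posterior covariance Sigma.
rng = np.random.default_rng(0)
d, n_cand, n_pick = 8, 50, 5
X_cand = rng.standard_normal((n_cand, d))   # candidate measurement vectors

Sigma = np.eye(d)                            # prior covariance over theta
chosen = []
for _ in range(n_pick):
    # predictive variance x_i^T Sigma x_i for every candidate
    scores = np.einsum("ij,jk,ik->i", X_cand, Sigma, X_cand)
    i = int(np.argmax(scores))
    chosen.append(i)
    x = X_cand[i]
    # rank-one posterior covariance update under unit observation noise
    Sx = Sigma @ x
    Sigma = Sigma - np.outer(Sx, Sx) / (1.0 + x @ Sx)
```

Each update shrinks total posterior uncertainty (the trace of Sigma), which is the sense in which adaptively chosen designs beat measurements drawn at random.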