## Hierarchical Gaussian process latent variable models (2007)

### Download Links

- [imls.engr.oregonstate.edu]
- [www.machinelearning.org]
- [eprints.pascal-network.org]
- CiteULike
- DBLP

### Other Repositories/Bibliography

Venue: International Conference on Machine Learning (ICML)

Citations: 24 (5 self)

### BibTeX

```bibtex
@INPROCEEDINGS{Lawrence07hierarchicalgaussian,
  author    = {Neil D. Lawrence},
  title     = {Hierarchical Gaussian process latent variable models},
  booktitle = {International Conference on Machine Learning},
  year      = {2007}
}
```

### Abstract

The Gaussian process latent variable model (GP-LVM) is a powerful approach to probabilistic modelling of high dimensional data through dimensionality reduction. In this paper we extend the GP-LVM through hierarchies. A hierarchical model (such as a tree) allows us to express conditional independencies in the data as well as the manifold structure. We first introduce Gaussian process hierarchies through a simple dynamical model; we then extend the approach to a more complex hierarchy which is applied to the visualisation of human motion data sets.
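The core (non-hierarchical) GP-LVM that the paper builds on can be sketched compactly: place a GP prior over the mapping from latent positions X to each data dimension, then optimise X to maximise the marginal likelihood of the data. The sketch below is illustrative only, not the paper's implementation; the RBF kernel, toy data, and L-BFGS-B optimiser with numerical gradients are assumptions made for brevity.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)

# Toy data: 30 points in d = 10 dimensions generated from a 1-D signal.
n, d, q = 30, 10, 1
t = np.linspace(0, 2 * np.pi, n)
Y = np.sin(t)[:, None] * rng.normal(size=(1, d)) + 0.05 * rng.normal(size=(n, d))
Y = Y - Y.mean(axis=0)

def rbf(X, lengthscale=1.0, variance=1.0, noise=1e-2):
    # Squared-exponential covariance over the latent points, plus jitter.
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    return variance * np.exp(-0.5 * sq / lengthscale**2) + noise * np.eye(len(X))

def neg_log_marginal(flat_X):
    # GP-LVM objective: negative log marginal likelihood of Y given the
    # latent positions X (the same GP prior is shared across dimensions).
    X = flat_X.reshape(n, q)
    K = rbf(X)
    _, logdet = np.linalg.slogdet(K)
    alpha = np.linalg.solve(K, Y)
    return 0.5 * d * logdet + 0.5 * np.sum(Y * alpha)

# Initialise the latents with the leading principal direction, then
# optimise them directly (gradients here are numerical, for brevity).
X0 = np.linalg.svd(Y, full_matrices=False)[0][:, :q]
res = minimize(neg_log_marginal, X0.ravel(), method="L-BFGS-B")
print("objective:", neg_log_marginal(X0.ravel()), "->", res.fun)
```

In practice analytic gradients with respect to X (and the kernel hyperparameters) are used; the hierarchical extension in the paper stacks such models so that latents at one layer become observations for the layer above.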

### Citations

7414 | Probabilistic reasoning in intelligent systems: Networks of plausible inference
- Pearl
- 1988
Citation Context: ...ition (Felzenszwalb & Huttenlocher, 2000; Ioffe & Forsyth, 2001) and human pose estimation (Ramanan & Forsyth, 2003; Sigal et al., 2004; Lan & Huttenlocher, 2005). From the probabilistic perspective (Pearl, 1988) the tree structures (and other sparse probabilistic models) offer a convenient way to specify conditional independencies in the model. In general, it is not clear how such conditional independencies...

510 | Probabilistic principal component analysis
- Tipping, Bishop
- 1999
Citation Context: ...s chosen to be linear, f_i(x_n) = w_i^T x_n, and the prior over the latent variables is taken to be Gaussian, then the maximum likelihood solution of the model spans the principal subspace of the data (Tipping & Bishop, 1999). However, if the mapping is non-linear it is unclear, in general, how to propagate the prior distribution's uncertainty through the nonlinearity. The alternative approach taken by the GP-LVM is to p...
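The linear special case quoted in this context has a closed form: with a linear mapping and a Gaussian prior on the latents, the maximum likelihood weight matrix spans the principal subspace (Tipping & Bishop, 1999). Below is a small numerical check of that claim; the toy dimensions and noise level are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: n = 100 points in d = 5 dimensions with q = 2 latent dimensions.
d, q, n = 5, 2, 100
W_true = rng.normal(size=(d, q))
Y = rng.normal(size=(n, q)) @ W_true.T + 0.1 * rng.normal(size=(n, d))
Y = Y - Y.mean(axis=0)

# Closed-form ML solution (Tipping & Bishop, 1999):
#   W_ML = U_q (L_q - sigma^2 I)^(1/2) R,
# where U_q, L_q hold the top-q eigenvectors/eigenvalues of the sample
# covariance and sigma^2 is the mean of the discarded eigenvalues.
S = Y.T @ Y / n
eigvals, eigvecs = np.linalg.eigh(S)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]
sigma2 = eigvals[q:].mean()
W_ml = eigvecs[:, :q] @ np.diag(np.sqrt(eigvals[:q] - sigma2))

# span(W_ML) coincides with the principal subspace: compare projectors.
P_pca = eigvecs[:, :q] @ eigvecs[:, :q].T
P_ml = W_ml @ np.linalg.pinv(W_ml)
print(np.allclose(P_pca, P_ml))  # True
```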

308 | Gaussian processes for machine learning
- Rasmussen
- 2006
Citation Context: ...e upper layer specifies the particular motion type. As the motion is broadly periodic, we made use of a periodic kernel (MacKay, 1998) for the regressive dynamics in each latent space (see pg. 92 in (Rasmussen & Williams, 2006) for details). The resulting visualisation is shown in Figure 5. 5. Discussion We have presented a hierarchical version of the Gaussian process latent variable model. The Gaussian process latent vari...

173 | Style-based inverse kinematics
- Grochow, Martin, et al.
- 2004
Citation Context: ...ian process latent variable model (Lawrence, 2004; Lawrence, 2005) has proven to be a highly effective approach to probabilistic modelling of high dimensional data that lies on a non-linear manifold (Grochow et al., 2004; Urtasun et al., 2005; Urtasun et al., 2006; Ferris et al., 2007). The curse of dimensionality is finessed by assuming that the high dimensional data is intrinsically low dimensional in nature. This ...

154 | Tracking loose-limbed people
- Sigal, Bhatia, et al.
- 2004
Citation Context: ...for object recognition (Felzenszwalb & Huttenlocher, 2000; Ioffe & Forsyth, 2001) and human pose estimation (Ramanan & Forsyth, 2003; Sigal et al., 2004; Lan & Huttenlocher, 2005). From the probabilistic perspective (Pearl, 1988) the tree structures (and other sparse probabilistic models) offer a convenient way to specify conditional independencies i...

148 | Probabilistic non-linear principal component analysis with Gaussian process latent variable models
- Lawrence
- 2005
Citation Context: ...l, we then extend the approach to a more complex hierarchy which is applied to the visualisation of human motion data sets. 1. Introduction The Gaussian process latent variable model (Lawrence, 2004; Lawrence, 2005) has proven to be a highly effective approach to probabilistic modelling of high dimensional data that lies on a non-linear manifold (Grochow et al., 2004; Urtasun et al., 2005; Urtasun et al., 2006;...

137 | 3d people tracking with gaussian process dynamical models
- Urtasun, Fleet, et al.
- 2006
Citation Context: ...2004; Lawrence, 2005) has proven to be a highly effective approach to probabilistic modelling of high dimensional data that lies on a non-linear manifold (Grochow et al., 2004; Urtasun et al., 2005; Urtasun et al., 2006; Ferris et al., 2007). The curse of dimensionality is finessed by assuming that the high dimensional data is intrinsically low dimensional in nature. This reduces the effective number of parameters i...

124 | Finding and tracking people from the bottom up
- Ramanan, Forsyth
- 2003
Citation Context: ...for object recognition (Felzenszwalb & Huttenlocher, 2000; Ioffe & Forsyth, 2001) and human pose estimation (Ramanan & Forsyth, 2003; Sigal et al., 2004; Lan & Huttenlocher, 2005). From the probabilistic perspective (Pearl, 1988) the tree structures (and other sparse probabilistic models) offer a convenient way to specify conditio...

118 | Introduction to Gaussian processes
- MacKay
- 1998
Citation Context: ...es in the lower levels of the hierarchy to span the range of motions, whilst the upper layer specifies the particular motion type. As the motion is broadly periodic, we made use of a periodic kernel (MacKay, 1998) for the regressive dynamics in each latent space (see pg. 92 in (Rasmussen & Williams, 2006) for details). The resulting visualisation is shown in Figure 5. 5. Discussion We have presented a hierarc...
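The periodic kernel referred to here (MacKay, 1998) is typically obtained by warping the input onto a circle and applying an RBF kernel, giving k(x, x') = σ² exp(−2 sin²(π(x − x')/p)/ℓ²). A minimal sketch; the function name, parameter names and defaults are assumptions:

```python
import numpy as np

def periodic_kernel(x1, x2, period=1.0, lengthscale=1.0, variance=1.0):
    # k(x, x') = variance * exp(-2 * sin^2(pi * |x - x'| / period) / lengthscale^2),
    # i.e. an RBF kernel applied after mapping the input onto a circle.
    d = np.pi * np.abs(x1[:, None] - x2[None, :]) / period
    return variance * np.exp(-2.0 * np.sin(d) ** 2 / lengthscale**2)

t = np.linspace(0.0, 3.0, 7)
K = periodic_kernel(t, t)

# Inputs exactly one period apart are perfectly correlated.
print(periodic_kernel(np.array([0.0]), np.array([1.0]))[0, 0])  # ≈ 1.0
```

Using such a kernel for the regressive dynamics makes the GP prior itself repeat with the motion's period, which is why it suits broadly periodic motion capture data.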

95 | Priors for people tracking from small training sets
- Urtasun, Fleet, et al.
- 2005
Citation Context: ...iable model (Lawrence, 2004; Lawrence, 2005) has proven to be a highly effective approach to probabilistic modelling of high dimensional data that lies on a non-linear manifold (Grochow et al., 2004; Urtasun et al., 2005; Urtasun et al., 2006; Ferris et al., 2007). The curse of dimensionality is finessed by assuming that the high dimensional data is intrinsically low dimensional in nature. This reduces the effective ...

91 | A hierarchical latent variable model for data visualization
- Bishop, Tipping
- 1998
Citation Context: ...Other Hierarchical Models Given apparent similarities between the model names, it is natural to ask what is the relationship between the hierarchical GP-LVM and the hierarchical probabilistic PCA of (Bishop & Tipping, 1998)? The two models are philosophically distinct. In hierarchical PCA (and the related hierarchical GTM model of (Tino & Nabney, 2002)) every node in the hierarchy is associated with a probabilistic mod...

89 | Gaussian process dynamical models
- Wang, Fleet, et al.
- 2006
Citation Context: ...ervation, y_n. This makes it much easier to augment the model with additional constraints or prior information about the data. Interesting examples include adding dynamical priors in the latent space (Wang et al., 2006; Urtasun et al., 2006) or constraining points in the latent space according to intuitively reasonable visualisation criteria (Lawrence & Quiñonero Candela, 2006). In this paper we further exploit thi...

57 | Combining belief networks and neural networks for scene segmentation
- Feng, Williams, et al.
- 2002
Citation Context: ...imensional data is to develop a latent variable model with sparse connectivity to explain the data. For example tree structured models have been suggested for modelling images (Williams & Feng, 1999; Feng et al., 2002; Awasthi et al., 2007), for object recognition (Felzenszwalb & Huttenlocher, 2000; Ioffe ...

53 | Beyond Trees: Common-Factor Models for 2D Human Pose Recovery
- Lan, Huttenlocher
- 2005
Citation Context: ...for object recognition (Felzenszwalb & Huttenlocher, 2000; Ioffe & Forsyth, 2001) and human pose estimation (Ramanan & Forsyth, 2003; Sigal et al., 2004; Lan & Huttenlocher, 2005). From the probabilistic perspective (Pearl, 1988) the tree structures (and other sparse probabilistic models) offer a convenient way to specify conditional independencies in the model. In general, i...

49 | Local distance preservation in the GP-LVM through back constraints
- Lawrence, Quiñonero-Candela
- 2006

48 | WiFiSLAM Using Gaussian Process Latent Variable Models
- Ferris, Fox, et al.
- 2007
Citation Context: ...has proven to be a highly effective approach to probabilistic modelling of high dimensional data that lies on a non-linear manifold (Grochow et al., 2004; Urtasun et al., 2005; Urtasun et al., 2006; Ferris et al., 2007). The curse of dimensionality is finessed by assuming that the high dimensional data is intrinsically low dimensional in nature. This reduces the effective number of parameters in the model enabling ...

43 | Bayesian neural networks and density networks
- MacKay
- 1995
Citation Context: ...imultaneous localisation and mapping (Ferris et al., 2007). All make use of smooth mappings from the latent space to the data space. The probabilistic approach to non-linear dimensionality reduction (MacKay, 1995; Bishop et al., 1998) is to formulate a latent variable model, where the latent dimension, q, is lower than the data dimension, d. The latent spa...

30 | Gaussian process models for visualisation of high dimensional data
- Lawrence
- 2004
Citation Context: ...e dynamical model, we then extend the approach to a more complex hierarchy which is applied to the visualisation of human motion data sets. 1. Introduction The Gaussian process latent variable model (Lawrence, 2004; Lawrence, 2005) has proven to be a highly effective approach to probabilistic modelling of high dimensional data that lies on a non-linear manifold (Grochow et al., 2004; Urtasun et al., 2005; Urtas...

24 | Hierarchical GTM: constructing localized non-linear projection manifolds in a principled way
- Tino, Nabney
- 2002
Citation Context: ...he hierarchical GP-LVM and the hierarchical probabilistic PCA of (Bishop & Tipping, 1998)? The two models are philosophically distinct. In hierarchical PCA (and the related hierarchical GTM model of (Tino & Nabney, 2002)) every node in the hierarchy is associated with a probabilistic model in data space. The hierarchy is not a hierarchy of latent variables, it is, instead, a hierarchical clustering of mixture compon...

15 | Mixtures of trees for object recognition
- Ioffe, Forsyth
- 2001
Citation Context: ...2002; Awasthi et al., 2007), for object recognition (Felzenszwalb & Huttenlocher, 2000; Ioffe & Forsyth, 2001) and human pose estimation (Ramanan & Forsyth, 2003; Sigal et al., 2004; Lan & Huttenlocher, 2005). From the probabilistic perspective (Pearl, 1988) the tree structures (and other sparse probabilisti...

10 | Image modeling using tree structured conditional random fields,” IJCAI
- Awasthi, Gagrani, et al.
- 2007
Citation Context: ...to develop a latent variable model with sparse connectivity to explain the data. For example tree structured models have been suggested for modelling images (Williams & Feng, 1999; Feng et al., 2002; Awasthi et al., 2007), for object recognition (Felzenszwalb & Huttenlocher, 2000; Ioffe & Forsyth, 2001) and h...

1 | Efficient matching of pictorial structures
- Felzenszwalb, Huttenlocher
- 2000