Results 1-10 of 31
A Language-based Approach to Measuring Scholarly Impact
Abstract

Cited by 44 (1 self)
Identifying the most influential documents in a corpus is an important problem in many fields, from information science and historiography to text summarization and news aggregation. Unfortunately, traditional bibliometrics such as citations are often not available. We propose using changes in the thematic content of documents over time to measure the importance of individual documents within the collection. We describe a dynamic topic model for both quantifying and qualifying the impact of these documents. We validate the model by analyzing three large corpora of scientific articles. Our measurement of a document’s impact correlates significantly with its number of citations.
Kernels for Vector-Valued Functions: A Review
2011
Abstract

Cited by 32 (2 self)
Kernel methods are among the most popular techniques in machine learning. From a frequentist/discriminative perspective they play a central role in regularization theory, as they provide a natural choice for the hypothesis space and the regularization functional through the notion of reproducing kernel Hilbert spaces. From a Bayesian/generative perspective they are key in the context of Gaussian processes, where the kernel function is also known as the covariance function. Traditionally, kernel methods have been used in supervised learning problems with scalar outputs, and indeed there has been a considerable amount of work devoted to designing and learning kernels. More recently there has been an increasing interest in methods that deal with multiple outputs, motivated partly by frameworks like multi-task learning. In this paper, we review different methods to design or learn valid kernel functions for multiple outputs, paying particular attention to the connection between probabilistic and functional methods.
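One standard construction surveyed in this line of work is the intrinsic coregionalization model, where a single base kernel is combined with a positive-definite matrix coupling the outputs. A minimal sketch (the RBF base kernel and the rank-1 factor for `B` are illustrative choices, not this review's specific recommendation):

```python
import numpy as np

def rbf(x1, x2, lengthscale=1.0):
    # Squared-exponential (RBF) kernel on 1-D inputs
    d2 = (x1[:, None] - x2[None, :]) ** 2
    return np.exp(-0.5 * d2 / lengthscale**2)

def icm_kernel(x1, x2, B, lengthscale=1.0):
    # Intrinsic coregionalization model: K((x,d),(x',d')) = B[d,d'] * k(x,x').
    # The full multi-output covariance is the Kronecker product of B with K.
    return np.kron(B, rbf(x1, x2, lengthscale))

# Illustrative usage: two correlated outputs over five shared inputs
x = np.linspace(0.0, 1.0, 5)
W = np.array([[1.0], [0.5]])      # rank-1 factor (assumed, for illustration)
B = W @ W.T + 0.1 * np.eye(2)     # positive-definite coregionalization matrix
K = icm_kernel(x, x, B)           # 10 x 10 joint covariance over both outputs
```

Because `B` is positive definite and the base kernel is positive semidefinite, the Kronecker product is a valid multi-output covariance.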
Computationally efficient convolved multiple output Gaussian processes
Journal of Machine Learning Research
Abstract

Cited by 27 (2 self)
Recently there has been an increasing interest in regression methods that deal with multiple outputs. This has been motivated partly by frameworks like multi-task learning, multi-sensor networks or structured output data. From a Gaussian processes perspective, the problem reduces to specifying an appropriate covariance function that, whilst being positive semidefinite, captures the dependencies between all the data points and across all the outputs. One approach to account for non-trivial correlations between outputs employs convolution processes. Under a latent function interpretation of the convolution transform we establish dependencies between output variables. The main drawbacks of this approach are the associated computational and storage demands. In this paper we address these issues. We present different efficient approximations for dependent output Gaussian processes constructed through the convolution formalism. We exploit the conditional independencies present naturally in the model. This leads to a form of the covariance similar in spirit to the so-called PITC and FITC approximations for a single output. We show experimental results with synthetic and real data; in particular, we show results in school exam score prediction, pollution prediction and gene expression data.
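When the smoothing kernels and the latent process covariance are both Gaussian, the convolution construction described above yields a closed-form output covariance in which the widths add in quadrature. The sketch below illustrates that special case only, with normalising constants absorbed into unit signal scales (an assumption for readability, not the paper's parameterisation):

```python
import numpy as np

def conv_cov(x1, x2, ell_u, ell_a, ell_b):
    # Covariance between outputs a and b of a convolution process
    # f_d(x) = integral of G_d(x - z) u(z) dz, where each smoothing
    # kernel G_d is Gaussian with width ell_d and the latent GP u has
    # an RBF covariance with width ell_u. Convolving Gaussians adds
    # their variances, so the result is again an RBF kernel.
    var = ell_u**2 + ell_a**2 + ell_b**2
    d2 = (x1[:, None] - x2[None, :]) ** 2
    return np.exp(-0.5 * d2 / var)

# Illustrative usage: outputs with broader smoothing kernels
# decorrelate more slowly with input distance
x = np.linspace(0.0, 2.0, 4)
K_aa = conv_cov(x, x, ell_u=0.3, ell_a=0.2, ell_b=0.2)
K_ab = conv_cov(x, x, ell_u=0.3, ell_a=0.2, ell_b=0.5)
```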
Efficient multi-output Gaussian processes through variational inducing kernels
 In JMLR: W&CP 9
2010
Abstract

Cited by 12 (2 self)
Interest in multi-output kernel methods is increasing, whether under the guise of multi-task learning, multi-sensor networks or structured output data. From the Gaussian process perspective, a multi-output Mercer kernel is a covariance function over correlated output functions. One way of constructing such kernels is based on convolution processes (CPs). A key problem for this approach is efficient inference. Álvarez and Lawrence recently presented a sparse approximation for CPs that enabled efficient inference. In this paper, we extend this work in two directions: we introduce the concept of variational inducing functions to handle potentially non-smooth functions involved in the kernel CP construction, and we consider an alternative approach to approximate inference based on variational methods, extending the work by Titsias (2009) to the multiple output case. We demonstrate our approaches on prediction of school marks, compiler performance and financial time series.
Switched Latent Force Models for Movement Segmentation
Abstract

Cited by 11 (2 self)
Latent force models encode the interaction between multiple related dynamical systems in the form of a kernel or covariance function. Each variable to be modeled is represented as the output of a differential equation, and each differential equation is driven by a weighted sum of latent functions with uncertainty given by a Gaussian process prior. In this paper we consider employing the latent force model framework for the problem of determining robot motor primitives. To deal with discontinuities in the dynamical systems or the latent driving force we introduce an extension of the basic latent force model that switches between different latent functions and potentially different dynamical systems. This creates a versatile representation for robot movements that can capture discrete changes and nonlinearities in the dynamics. We give illustrative examples on both synthetic data and for striking movements recorded using a Barrett WAM robot as a haptic input device. Our inspiration is robot motor primitives, but we expect our model to have wide application for dynamical systems, including models for human motion capture data and systems biology.
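The generative idea behind a (non-switching) latent force model can be sketched in a few lines: each output obeys a first-order ODE driven by a shared latent function, which here is simply a smooth draw from an RBF Gaussian process. All parameter values are illustrative, and this shows simulation only, not the authors' covariance-function construction or inference:

```python
import numpy as np

rng = np.random.default_rng(0)

# Latent force u(t): one smooth sample from an RBF Gaussian process prior
t = np.linspace(0.0, 5.0, 500)
d2 = (t[:, None] - t[None, :]) ** 2
K = np.exp(-0.5 * d2 / 0.5**2) + 1e-8 * np.eye(len(t))  # jitter for stability
u = np.linalg.cholesky(K) @ rng.standard_normal(len(t))

# Two outputs, each a first-order ODE  dx_d/dt = -D_d x_d + S_d u(t),
# integrated with a simple explicit Euler scheme
dt = t[1] - t[0]
D = np.array([1.0, 3.0])    # decay rates (illustrative)
S = np.array([1.0, -0.5])   # sensitivities to the latent force (illustrative)
x = np.zeros((len(t), 2))
for i in range(1, len(t)):
    x[i] = x[i - 1] + dt * (-D * x[i - 1] + S * u[i - 1])
```

The two outputs are correlated because they share the same driving force; the switching extension described above would additionally swap `u` (and possibly `D`, `S`) at discrete change points.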
A Constrained Latent Variable Model
Abstract

Cited by 10 (3 self)
Latent variable models provide valuable compact representations for learning and inference in many computer vision tasks. However, most existing models cannot directly encode prior knowledge about the specific problem at hand. In this paper, we introduce a constrained latent variable model whose generated output inherently accounts for such knowledge. To this end, we propose an approach that explicitly imposes equality and inequality constraints on the model’s output during learning, thus avoiding the computational burden of having to account for these constraints at inference. Our learning mechanism can exploit nonlinear kernels, while only involving sequential closed-form updates of the model parameters. We demonstrate the effectiveness of our constrained latent variable model on the problem of non-rigid 3D reconstruction from monocular images, and show that it yields qualitative and quantitative improvements over several baselines.
Approximate inference in continuous time Gaussian-jump processes
 Advances in Neural Information Processing Systems 23
2010
Abstract

Cited by 9 (3 self)
We present a novel approach to inference in conditionally Gaussian continuous time stochastic processes, where the latent process is a Markovian jump process. We first consider the case of jump-diffusion processes, where the drift of a linear stochastic differential equation can jump at arbitrary time points. We derive partial differential equations for exact inference and present a very efficient mean field approximation. By introducing a novel lower bound on the free energy, we then generalise our approach to Gaussian processes with arbitrary covariance, such as the non-Markovian RBF covariance. We present results on both simulated and real data, showing that the approach is very accurate in capturing latent dynamics and can be useful in a number of real data modelling tasks.
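The jump-diffusion setting can be sketched generatively: a two-state Markov jump process switches the drift offset of a linear SDE, which is then integrated with Euler–Maruyama. Parameters are illustrative, and this shows forward simulation only, not the paper's mean-field inference:

```python
import numpy as np

rng = np.random.default_rng(1)
dt, T = 0.01, 10.0
n = int(T / dt)

# Two-state Markov jump process z(t): switches state with probability
# rate * dt per step (a discrete-time approximation of the jump process)
rate = 0.5
z = np.zeros(n, dtype=int)
for i in range(1, n):
    z[i] = 1 - z[i - 1] if rng.random() < rate * dt else z[i - 1]

# Linear SDE  dx = (b[z] - a * x) dt + sigma dW : the drift offset b
# jumps whenever z jumps, so x relaxes toward a switching mean
a, sigma = 1.0, 0.2
b = np.array([-1.0, 1.0])
x = np.zeros(n)
for i in range(1, n):
    drift = b[z[i - 1]] - a * x[i - 1]
    x[i] = x[i - 1] + drift * dt + sigma * np.sqrt(dt) * rng.standard_normal()
```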
Gaussian Processes for Time-series Modelling
Philosophical Transactions of the Royal Society, Part A
Abstract

Cited by 7 (1 self)
In this paper we offer a gentle introduction to Gaussian processes for time-series data analysis. The conceptual framework of Bayesian modelling for time-series data is discussed, and the foundations of Bayesian nonparametric modelling are presented for Gaussian processes. We discuss how domain knowledge influences the design of Gaussian process models and provide case examples to highlight the approaches.
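The standard GP regression equations that underlie such an introduction can be sketched with an RBF kernel and a Cholesky solve; the hyperparameter values and the noisy-sinusoid data are illustrative assumptions:

```python
import numpy as np

def gp_posterior(x, y, xs, ell=1.0, sf=1.0, sn=0.1):
    # GP regression with an RBF kernel: posterior mean and marginal
    # variance at test inputs xs, via the usual Cholesky factorisation.
    def k(a, b):
        return sf**2 * np.exp(-0.5 * (a[:, None] - b[None, :])**2 / ell**2)
    K = k(x, x) + sn**2 * np.eye(len(x))       # noisy training covariance
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    Ks = k(xs, x)
    mean = Ks @ alpha                          # posterior mean
    v = np.linalg.solve(L, Ks.T)
    var = sf**2 - np.sum(v**2, axis=0)         # posterior marginal variance
    return mean, var

# Illustrative usage on a noisy sinusoid
x = np.linspace(0.0, 6.0, 20)
y = np.sin(x) + 0.1 * np.random.default_rng(2).standard_normal(20)
xs = np.linspace(0.0, 6.0, 50)
mean, var = gp_posterior(x, y, xs)
```

Domain knowledge enters exactly where the review says it does: in the choice of kernel and its hyperparameters (`ell`, `sf`, `sn` here).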
Linear Operators and Stochastic Partial Differential Equations in Gaussian Process Regression
Abstract

Cited by 7 (2 self)
In this paper we shall discuss an extension to Gaussian process (GP) regression models, where the measurements are modeled as linear functionals of the underlying GP and the estimation objective is a general linear operator of the process. We shall show how this framework can be used for modeling physical processes involved in measurement of the GP and for encoding physical prior information into regression models in the form of stochastic partial differential equations (SPDEs). We shall also illustrate the practical applicability of the theory in a simulated application. Keywords: Gaussian process regression, linear operator, stochastic partial differential equation, inverse problem
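A standard special case of this linear-functional framework is observing derivatives of a GP: because differentiation is linear, the required cross-covariances follow by differentiating the kernel. The one-dimensional RBF case is sketched below (this is the textbook construction, not the paper's SPDE examples):

```python
import numpy as np

def rbf(x1, x2, ell=1.0):
    # Base RBF kernel k(x1, x2) on 1-D inputs
    r = x1[:, None] - x2[None, :]
    return np.exp(-0.5 * r**2 / ell**2)

def rbf_dx2(x1, x2, ell=1.0):
    # cov(f(x1), f'(x2)) = dk/dx2 = (r / ell^2) k,  with r = x1 - x2
    r = x1[:, None] - x2[None, :]
    return (r / ell**2) * rbf(x1, x2, ell)

def rbf_dx1_dx2(x1, x2, ell=1.0):
    # cov(f'(x1), f'(x2)) = d^2 k / dx1 dx2 = (1/ell^2 - r^2/ell^4) k
    r = x1[:, None] - x2[None, :]
    return (1.0 / ell**2 - r**2 / ell**4) * rbf(x1, x2, ell)

# Joint covariance of (f(x), f'(x)) is a valid (PSD) covariance matrix
x = np.linspace(0.0, 2.0, 5)
J = np.block([[rbf(x, x), rbf_dx2(x, x)],
              [rbf_dx2(x, x).T, rbf_dx1_dx2(x, x)]])
```

The same pattern extends to any linear operator L: apply L to one or both arguments of the kernel to obtain the needed cross-covariances.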
State-Space Inference for Non-Linear Latent Force Models with Application to Satellite Orbit Prediction
Abstract

Cited by 6 (2 self)
Latent force models (LFMs) are flexible models that combine mechanistic modelling principles (i.e., physical models) with nonparametric data-driven components. Several key applications of LFMs need nonlinearities, which results in analytically intractable inference. In this work we show how nonlinear LFMs can be represented as nonlinear, white-noise-driven state-space models and present an efficient nonlinear Kalman filtering and smoothing based method for approximate state and parameter inference. We illustrate the performance of the proposed methodology via two simulated examples, and apply it to a real-world problem of long-term prediction of GPS satellite orbits.