Results 1 - 5 of 5
Inducing Features of Random Fields
 IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE
, 1997
Cited by 556 (13 self)
We present a technique for constructing random fields from a set of training samples. The learning paradigm builds increasingly complex fields by allowing potential functions, or features, that are supported by increasingly large subgraphs. Each feature has a weight that is trained by minimizing the Kullback-Leibler divergence between the model and the empirical distribution of the training data. A greedy algorithm determines how features are incrementally added to the field, and an iterative scaling algorithm is used to estimate the optimal values of the weights. The random field models and techniques introduced in this paper differ from those common to much of the computer vision literature in that the underlying random fields are non-Markovian and have a large number of parameters that must be estimated. Relations to other learning approaches, including decision trees, are given. As a demonstration of the method, we describe its application to the problem of automatic word classification...
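The greedy feature-induction loop the abstract describes can be sketched on a toy problem. This is a minimal illustration, not the paper's implementation: the state space, data, and candidate features below are invented for the example, the greedy step uses a crude expectation-gap heuristic in place of the paper's exact gain computation, and the weight update is the exact iterative-scaling step for a single binary feature.

```python
import math

# Toy state space: binary triples.
SPACE = [(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1)]

# Illustrative training data (empirical distribution).
data = [(1, 1, 0), (1, 1, 1), (0, 0, 1), (1, 0, 0)]

# Candidate binary features supported on small subsets of coordinates.
candidate_features = [
    lambda x: float(x[0] == 1),
    lambda x: float(x[0] == x[1]),
    lambda x: float(x[1] == x[2]),
]

def model_probs(features, weights):
    """Exponential-family model p(x) ∝ exp(sum_i w_i f_i(x))."""
    scores = [math.exp(sum(w * f(x) for w, f in zip(weights, features)))
              for x in SPACE]
    z = sum(scores)  # partition function, exact on this tiny space
    return [s / z for s in scores]

def expectation(f, probs):
    return sum(p * f(x) for p, x in zip(probs, SPACE))

def empirical_expectation(f):
    return sum(f(x) for x in data) / len(data)

features, weights = [], []
for _ in range(2):  # greedily add two features
    probs = model_probs(features, weights)
    # Crude greedy criterion: largest gap between empirical and
    # model expectations (stand-in for the paper's gain computation).
    gaps = [abs(empirical_expectation(f) - expectation(f, probs))
            for f in candidate_features]
    best = max(range(len(candidate_features)), key=lambda i: gaps[i])
    features.append(candidate_features.pop(best))
    weights.append(0.0)
    # Iterative scaling: cycle exact coordinate updates for binary
    # features until the model expectations match the empirical ones.
    for _ in range(50):
        for i, f in enumerate(features):
            probs = model_probs(features, weights)
            e_emp = empirical_expectation(f)
            e_mod = expectation(f, probs)
            weights[i] += math.log(e_emp * (1 - e_mod)
                                   / (e_mod * (1 - e_emp)))

probs = model_probs(features, weights)
```

At convergence the model expectation of every selected feature matches its empirical expectation, which is the fixed point the iterative scaling updates aim for.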
Markov random fields and images
 CWI Quarterly
, 1998
Cited by 25 (0 self)
At the intersection of statistical physics and probability theory, Markov random fields and Gibbs distributions emerged in the early eighties as powerful tools for modeling images and coping with high-dimensional inverse problems from low-level vision. Since then, they have been used in many studies from the image processing and computer vision community. A brief and simple introduction to the basics of the domain is proposed.
Learning in Gibbsian Fields: How Accurate and How Fast Can It Be?
, 2002
Cited by 13 (2 self)
Gibbsian fields or Markov random fields are widely used in Bayesian image analysis, but learning Gibbs models is computationally expensive. The computational complexity is pronounced in the recent minimax entropy (FRAME) models, which use large neighborhoods and hundreds of parameters [22]. In this paper, we present a common framework for learning Gibbs models. We identify two key factors that determine the accuracy and speed of learning Gibbs models: the efficiency of likelihood functions and the variance in approximating partition functions using Monte Carlo integration. We propose three new algorithms. In particular, we are interested in a maximum satellite likelihood estimator, which makes use of a set of precomputed Gibbs models called "satellites" to approximate likelihood functions. This algorithm can approximately estimate the minimax entropy model for textures in seconds on an HP workstation. The performance of the various learning algorithms is compared in our experiments.
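The Monte Carlo issue the abstract highlights can be made concrete: the likelihood of a Gibbs model involves a partition-function ratio, which can be estimated by importance sampling from a precomputed reference model (playing the role of a "satellite"). The sketch below is illustrative only; the tiny enumerable state space, the feature, and the weights are assumptions chosen so the estimate can be checked against the exact ratio.

```python
import math
import random

random.seed(1)

SPACE = list(range(8))           # toy state space, small enough to enumerate
f = lambda x: float(x % 2)       # illustrative binary feature

def partition(w):
    """Exact partition function Z(w) = sum_x exp(w f(x))."""
    return sum(math.exp(w * f(x)) for x in SPACE)

w0, w = 0.5, 1.5                 # satellite weight and target weight
z0, z = partition(w0), partition(w)

# Sample from the satellite model p0(x) = exp(w0 f(x)) / Z(w0).
probs0 = [math.exp(w0 * f(x)) / z0 for x in SPACE]
samples = random.choices(SPACE, weights=probs0, k=20000)

# Importance-sampling identity: Z(w)/Z(w0) = E_{p0}[exp((w - w0) f(x))].
est_ratio = sum(math.exp((w - w0) * f(x)) for x in samples) / len(samples)
exact_ratio = z / z0
```

The variance of this estimator grows as the target model moves away from the satellite, which is why a *set* of precomputed reference models helps: each likelihood evaluation can be anchored to a nearby satellite.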
Time-Invariance Estimating Equations
 Bernoulli
, 1995
Cited by 5 (0 self)
We describe a general method for deriving estimators of the parameter of a statistical model, with particular relevance to highly structured stochastic systems such as spatial random processes and 'graphical' conditional independence models. The method is based on representing the stochastic model as the equilibrium distribution of a Markov process Y = (Y_t, t ≥ 0), where the discrete or continuous 'time' index t is to be understood as a fictional extra dimension added to the original setting. The parameter estimate θ̂ is obtained by equating to zero the generator of Y applied to a suitable statistic and evaluated at the data x. This produces an unbiased estimating equation for θ. Natural special cases include the reduced sample estimator in survival analysis, the maximum pseudolikelihood estimator for random fields and for point processes, the Takacs-Fiksel method for point processes, 'variational' estimators for random fields and multivariate distributions, and many standard esti...
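One of the "natural special cases" the abstract names, the maximum pseudolikelihood estimator for a random field, can be sketched on the simplest example: a one-dimensional Ising chain with coupling β. Everything below (chain length, true coupling, the grid search) is an illustrative assumption, not the paper's construction; it only shows the estimator that the general framework recovers.

```python
import math
import random

random.seed(2)
N, beta_true = 20000, 0.6

# Sample an Ising chain exactly: consecutive spins agree with
# probability e^beta / (e^beta + e^{-beta}).
p_agree = math.exp(beta_true) / (math.exp(beta_true) + math.exp(-beta_true))
x = [1]
for _ in range(N - 1):
    x.append(x[-1] if random.random() < p_agree else -x[-1])

def neg_pseudo_loglik(beta):
    """Negative log pseudolikelihood: sum over interior sites of
    -log p(x_i | x_{i-1}, x_{i+1}), where the full conditional is
    p(x_i | neighbours) ∝ exp(beta * x_i * (x_{i-1} + x_{i+1}))."""
    total = 0.0
    for i in range(1, N - 1):
        s = x[i - 1] + x[i + 1]
        total += math.log(1 + math.exp(-2 * beta * x[i] * s))
    return total

# Crude 1-D minimisation by grid search (illustrative, not efficient).
beta_hat = min((neg_pseudo_loglik(b), b)
               for b in [i * 0.05 for i in range(1, 30)])[1]
```

The appeal of pseudolikelihood here, as with the generator-based estimating equations in the paper, is that no partition function is ever computed: only local conditional distributions enter the objective.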