Results 1 – 4 of 4
MM algorithms for generalized Bradley–Terry models
The Annals of Statistics, 2004
Abstract

Cited by 33 (2 self)
The Bradley–Terry model for paired comparisons is a simple and much-studied means to describe the probabilities of the possible outcomes when individuals are judged against one another in pairs. Among the many studies of the model in the past 75 years, numerous authors have generalized it in several directions, sometimes providing iterative algorithms for obtaining maximum likelihood estimates for the generalizations. Building on a theory of algorithms known by the initials MM, for minorization–maximization, this paper presents a powerful technique for producing iterative maximum likelihood estimation algorithms for a wide class of generalizations of the Bradley–Terry model. While algorithms for problems of this type have tended to be custom-built in the literature, the techniques in this paper enable their mass production. Simple conditions are stated that guarantee that each algorithm described will produce a sequence that converges to the unique maximum likelihood estimator. Several of the algorithms and convergence results herein are new.
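The classical MM update for the basic (ungeneralized) Bradley–Terry model has a well-known closed form: each ability parameter becomes the player's win total divided by a weighted sum over its pairwise comparisons. A minimal Python sketch of that iteration, with the wins matrix, iteration cap, and stopping rule chosen for illustration (none are taken from the paper):

```python
# Sketch of the standard MM update for the basic Bradley-Terry model.
# wins[i][j] = number of times player i beat player j.

def bradley_terry_mm(wins, n_iter=500, tol=1e-10):
    n = len(wins)
    lam = [1.0 / n] * n  # ability parameters, normalized to sum to 1
    for _ in range(n_iter):
        new = []
        for i in range(n):
            w_i = sum(wins[i])  # total wins of player i
            # Minorizing the log-likelihood yields this closed-form update:
            denom = sum((wins[i][j] + wins[j][i]) / (lam[i] + lam[j])
                        for j in range(n) if j != i)
            new.append(w_i / denom)
        s = sum(new)
        new = [x / s for x in new]  # renormalize; parameters are scale-free
        if max(abs(a - b) for a, b in zip(lam, new)) < tol:
            return new
        lam = new
    return lam
```

On a balanced schedule (every pair compared equally often), the fitted abilities rank players by their win totals, which gives a quick sanity check of the iteration.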
Quantile regression via an MM algorithm
J. Comput. Graphical Stat., 2000
Abstract

Cited by 13 (1 self)
Quantile regression is an increasingly popular method for estimating the quantiles of a distribution conditional on the values of covariates. Regression quantiles are robust against the influence of outliers, and taken several at a time, they give a more complete picture of the conditional distribution than a single estimate of the center. The current paper first presents an iterative algorithm for finding sample quantiles without sorting and then explores a generalization of the algorithm to nonlinear quantile regression. Our quantile regression algorithm is termed an MM, or Majorize–Minimize, algorithm because it entails majorizing the objective function by a quadratic function followed by minimizing that quadratic. The algorithm is conceptually simple and easy to code, and our numerical tests suggest that it is computationally competitive with a recent interior point algorithm for most problems.
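The quadratic majorization the abstract describes can be illustrated on its simplest case, finding a sample quantile without sorting. A minimal Python sketch, assuming the standard majorizer |r| ≤ r²/(2|r_k|) + |r_k|/2 for the check loss and a small ε guarding against zero residuals (these details are assumptions for illustration, not the authors' code):

```python
# Illustrative MM iteration for a sample tau-quantile without sorting.
# Minimizes sum_i rho_tau(y_i - q), rho_tau(r) = (|r| + (2*tau - 1)*r) / 2.

def mm_quantile(y, tau=0.5, n_iter=500, eps=1e-9):
    n = len(y)
    q = sum(y) / n  # any starting value works; use the mean
    for _ in range(n_iter):
        # Majorize each |y_i - q| by (y_i - q)^2 / (2|r_i|) + |r_i|/2,
        # where r_i is the residual at the current iterate.
        w = [1.0 / (abs(yi - q) + eps) for yi in y]
        # Minimizing the quadratic surrogate gives a closed-form update:
        q = (sum(wi * yi for wi, yi in zip(w, y)) + n * (2 * tau - 1)) / sum(w)
    return q
```

For tau = 0.5 this is a one-dimensional Weiszfeld-style iteration: each step is a weighted average that converges to the sample median, even in the presence of a large outlier.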
An Optimization Transfer Algorithm for Quantile Regression
1998
Abstract

Cited by 2 (2 self)
The q quantiles of an integrable random variable solve a minimization problem involving a certain expectation. This optimality principle suggests an algorithm for finding a sample quantile without sorting. The current paper explores generalizations of the algorithm to nonlinear quantile regression. In particular, we devise a quantile regression algorithm that operates by transferring optimization of the nondifferentiable objective function to a quadratic function. It is conceptually simple and easy to code, and our numerical tests suggest that it is computationally competitive with the interior point algorithm of Koenker and Park (1996).
Computing Estimates in the Proportional Odds Model
2000
Abstract

Cited by 1 (1 self)
The semiparametric proportional odds model for survival data is useful when mortality rates of different groups converge over time. However, fitting the model by maximum likelihood proves computationally cumbersome for large datasets because the number of parameters exceeds the number of uncensored observations. We present here an alternative to the standard Newton–Raphson method of maximum likelihood estimation. Our algorithm, an example of a minorization–maximization (MM) algorithm, is guaranteed to converge to the maximum likelihood estimate whenever it exists. For large problems, both the algorithm and its quasi-Newton accelerated counterpart outperform Newton–Raphson by more than two orders of magnitude.