## An interior-point method for large-scale l1-regularized logistic regression (2007)


Venue: Journal of Machine Learning Research

Citations: 152 (5 self)

### BibTeX

```bibtex
@article{Koh07aninterior-point,
  author  = {Kwangmoo Koh and Seung-Jean Kim and Stephen Boyd},
  title   = {An interior-point method for large-scale l1-regularized logistic regression},
  journal = {Journal of Machine Learning Research},
  year    = {2007},
  volume  = {8}
}
```


### Abstract

Logistic regression with ℓ1 regularization has been proposed as a promising method for feature selection in classification problems. In this paper we describe an efficient interior-point method for solving large-scale ℓ1-regularized logistic regression problems. Small problems, with up to a thousand or so features and examples, can be solved in seconds on a PC; medium-sized problems, with tens of thousands of features and examples, can be solved in tens of seconds (assuming some sparsity in the data). A variation on the basic method that uses a preconditioned conjugate gradient method to compute the search step can solve very large problems, with a million features and examples (e.g., the 20 Newsgroups data set), in a few minutes on a PC. Using warm-start techniques, a good approximation of the entire regularization path can be computed much more efficiently than by solving a family of problems independently.
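The problem the paper targets is minimizing the average logistic loss plus an ℓ1 penalty, minimize (1/m) Σᵢ log(1 + exp(−bᵢ aᵢᵀx)) + λ‖x‖₁. As a minimal, runnable sketch of that objective (using a plain proximal-gradient/ISTA solver as a stand-in, *not* the authors' interior-point method; all function names and data here are illustrative):

```python
import numpy as np

def logistic_loss_grad(x, A, b):
    # Average logistic loss (1/m) sum_i log(1 + exp(-b_i * a_i^T x))
    # and its gradient; rows of A are the a_i, b holds labels in {-1, +1}.
    z = b * (A @ x)
    loss = np.mean(np.logaddexp(0.0, -z))
    s = -b / (1.0 + np.exp(z))          # d loss_i / d (a_i^T x)
    grad = A.T @ s / len(b)
    return loss, grad

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (componentwise shrinkage).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def l1_logreg(A, b, lam, iters=500, x0=None):
    # ISTA stand-in for the paper's interior-point solver (illustrative only).
    # Step size 1/L, where L bounds the Lipschitz constant of the smooth part.
    L = np.linalg.norm(A, 2) ** 2 / (4 * len(b))
    step = 1.0 / L
    x = np.zeros(A.shape[1]) if x0 is None else x0.copy()
    for _ in range(iters):
        _, g = logistic_loss_grad(x, A, b)
        x = soft_threshold(x - step * g, step * lam)
    return x
```

Passing the solution for one value of λ as `x0` when solving the next mimics, in spirit, the warm-start trick the abstract describes for tracing the regularization path; larger λ drives more coefficients exactly to zero.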