## Relative Loss Bounds for On-line Density Estimation with the Exponential Family of Distributions (2000)

### Download Links

- [www.cse.ucsc.edu]
- DBLP

### Other Repositories/Bibliography

Venue: MACHINE LEARNING

Citations: 115 (10 self)

### BibTeX

@INPROCEEDINGS{Azoury00relativeloss,
  author    = {Katy S. Azoury and M. K. Warmuth},
  title     = {Relative Loss Bounds for On-line Density Estimation with the Exponential Family of Distributions},
  booktitle = {MACHINE LEARNING},
  year      = {2000},
  publisher = {Morgan Kaufmann}
}


### Abstract

We consider on-line density estimation with a parameterized density from the exponential family. The on-line algorithm receives one example at a time and maintains a parameter that is essentially an average of the past examples. After receiving an example, the algorithm incurs a loss, which is the negative log-likelihood of the example with respect to the past parameter of the algorithm. An off-line algorithm can choose the best parameter based on all the examples. We prove bounds on the additional total loss of the on-line algorithm over the total loss of the best off-line parameter. These relative loss bounds hold for an arbitrary sequence of examples. The goal is to design algorithms with the best possible relative loss bounds. We use a Bregman divergence to derive and analyze each algorithm. These divergences are relative entropies between two exponential distributions. We also use our methods to prove relative loss bounds for linear regression.
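The protocol in the abstract can be illustrated with a minimal sketch for one exponential-family member, a unit-variance Gaussian: the learner keeps a running mean of past examples, pays the negative log-likelihood of each new example under the *previous* parameter, and its total loss is compared to that of the single best parameter chosen in hindsight. This is only an illustration of the protocol, not the paper's algorithms or bounds; the function name and the prior `mu0` are hypothetical.

```python
import math

def online_gaussian_estimation(xs, mu0=0.0):
    """Illustrative on-line density-estimation protocol for a unit-variance
    Gaussian. mu0 is a hypothetical initial parameter for the first trial."""
    mu, n = mu0, 0
    online_loss = 0.0
    for x in xs:
        # loss on this trial: negative log-likelihood of x under the PAST parameter
        online_loss += 0.5 * (x - mu) ** 2 + 0.5 * math.log(2 * math.pi)
        # update: the maintained parameter is essentially an average of past examples
        n += 1
        mu += (x - mu) / n
    # the off-line comparator sees all examples and picks the overall mean
    best = sum(xs) / len(xs)
    offline_loss = sum(0.5 * (x - best) ** 2 + 0.5 * math.log(2 * math.pi)
                       for x in xs)
    # the quantity bounded in the paper is this difference (the relative loss)
    return online_loss, offline_loss, online_loss - offline_loss
```

Note that for this Gaussian case the log-likelihood loss reduces to the square loss plus a constant that cancels in the difference, which is why the same machinery also yields relative loss bounds for linear regression.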