Results 1 - 4 of 4
Arcsine laws and interval partitions derived from a stable subordinator
Proc. London Math. Soc., 1992
Abstract

Cited by 44 (24 self)
Lévy discovered that the fraction of time a standard one-dimensional Brownian motion B spends positive before time t has the arcsine distribution, both for t a fixed time, when B_t ≠ 0 almost surely, and for t an inverse local time, when B_t = 0 almost surely. This identity in distribution is extended from the fraction of time spent positive to a large collection of functionals derived from the lengths and signs of excursions of B away from 0. Similar identities in distribution are associated with any process whose zero set is the range of a stable subordinator, for instance a Bessel process of dimension d.
A Stochastic Memoizer for Sequence Data
Abstract

Cited by 20 (7 self)
We propose an unbounded-depth, hierarchical, Bayesian nonparametric model for discrete sequence data. This model can be estimated from a single training sequence, yet shares statistical strength between subsequent symbol predictive distributions in such a way that predictive performance generalizes well. The model builds on a specific parameterization of an unbounded-depth hierarchical Pitman–Yor process. We introduce analytic marginalization steps (using coagulation operators) to reduce this model to one that can be represented in time and space linear in the length of the training sequence. We show how to perform inference in such a model without truncation approximation and introduce fragmentation operators necessary to do predictive inference. We demonstrate the sequence memoizer by using it as a language model, achieving state-of-the-art results.
Indian Buffet Processes with Power-law Behavior
Abstract

Cited by 16 (1 self)
The Indian buffet process (IBP) is an exchangeable distribution over binary matrices used in Bayesian nonparametric featural models. In this paper we propose a three-parameter generalization of the IBP exhibiting power-law behavior. We achieve this by generalizing the beta process (the de Finetti measure of the IBP) to the stable-beta process and deriving the IBP corresponding to it. We find interesting relationships between the stable-beta process and the Pitman–Yor process (another stochastic process used in Bayesian nonparametric models with interesting power-law properties). We derive a stick-breaking construction for the stable-beta process, and find that our power-law IBP is a good model for word occurrences in document corpora.
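For readers unfamiliar with the stick-breaking constructions this abstract refers to, the following is a minimal Python sketch of the standard stick-breaking representation of the two-parameter Pitman–Yor process (the simpler of the processes named above; this sketch is not taken from the paper, and the parameter names d for discount and c for concentration are illustrative):

```python
import random

def pitman_yor_stick_breaking(d, c, n_sticks, seed=0):
    """Sample the first n_sticks weights of a Pitman-Yor process.

    The k-th stick proportion is drawn as v_k ~ Beta(1 - d, c + k*d),
    and the k-th weight is v_k times the length of stick remaining.
    Requires 0 <= d < 1 and c > -d.
    """
    rng = random.Random(seed)
    weights = []
    remaining = 1.0  # unbroken portion of the unit stick
    for k in range(1, n_sticks + 1):
        v = rng.betavariate(1.0 - d, c + k * d)
        weights.append(remaining * v)
        remaining *= 1.0 - v
    return weights
```

Setting d = 0 recovers the Dirichlet process; a positive discount d slows the decay of the weights, which is the power-law behavior the papers above exploit.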
Limits of renewal processes and Pitman–Yor distribution
 Electronic Communications in Probability
Abstract
We consider a renewal process with regularly varying stationary and weakly dependent steps, and prove that the steps made before a given time t satisfy an interesting invariance principle. Namely, together with the age of the renewal process at time t, they converge after scaling to the Pitman–Yor distribution. We further discuss how our results extend the classical Dynkin–Lamperti theorem.