Results 1 - 8 of 8
Sequential Importance Sampling for Nonparametric Bayes Models: The Next Generation
 Journal of Statistics
, 1998
Abstract

Cited by 67 (5 self)
In this paper, we exploit the similarities between the Gibbs sampler and the SIS, bringing over the improvements for Gibbs sampling algorithms to the SIS setting for nonparametric Bayes problems. These improvements result in an improved sampler and help answer questions raised by Diaconis (1995) pertaining to convergence. Such an effort may find wide application in many other problems related to dynamic systems where the SIS is useful (Berzuini et al. 1996; Liu and Chen 1996). Section 2 describes the specific model that we consider. For illustration we focus discussion on the beta-binomial model, although the methods are applicable to other conjugate families. In Section 3, we describe the first generation of the SIS and Gibbs sampler in this context, and present the necessary conditional distributions upon which the techniques rely. Section 4 describes the alterations that create the second-generation techniques, and provides specific algorithms for the model we consider. Section 5 presents a comparison of the techniques on a large set of data. Section 6 provides theory that ensures the proposed methods work and that is generally applicable to many other problems using importance sampling approaches. The final section presents discussion.
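To make the connection between sequential importance sampling and conjugate mixture models concrete, here is a minimal sketch of one SIS particle for a Dirichlet-process mixture of binomials with a Beta(a, b) base measure (the beta-binomial setting mentioned above). The function names, the CRP-style allocation, and the choice of proposal (the posterior predictive allocation, so the incremental weight is the one-step predictive probability) are illustrative assumptions, not the paper's exact algorithm.

```python
import math
import random

def betabinom_pred(x, m, a, b):
    """Beta-binomial predictive P(X = x) for m trials under a Beta(a, b) prior."""
    return (math.comb(m, x)
            * math.exp(math.lgamma(a + x) + math.lgamma(b + m - x)
                       - math.lgamma(a + b + m)
                       - (math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b))))

def sis_particle(xs, m, a=1.0, b=1.0, alpha=1.0, rng=random):
    """One SIS particle: sequentially allocate each observation to a cluster.

    The proposal is the posterior predictive allocation (CRP prior times
    predictive likelihood), so the incremental importance weight is the
    normalizing constant p(x_i | x_1, ..., x_{i-1}, allocations so far).
    Returns (cluster labels, log importance weight)."""
    labels, stats, logw = [], [], 0.0       # stats[c] = (n_obs, total successes)
    for i, x in enumerate(xs):
        probs = []
        for n_c, s_c in stats:              # existing clusters
            probs.append(n_c * betabinom_pred(x, m, a + s_c, b + n_c * m - s_c))
        probs.append(alpha * betabinom_pred(x, m, a, b))    # new cluster
        norm = sum(probs)
        logw += math.log(norm / (i + alpha))
        c = rng.choices(range(len(probs)), weights=probs)[0]
        if c == len(stats):
            stats.append((1, x))            # open a new cluster
        else:
            n_c, s_c = stats[c]
            stats[c] = (n_c + 1, s_c + x)   # update sufficient statistics
        labels.append(c)
    return labels, logw
```

Under a uniform Beta(1, 1) base measure the predictive is uniform over the m + 1 outcomes, which gives a quick sanity check on `betabinom_pred`.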
Some applications of generalized FFTs
 In Proceedings of DIMACS Workshop in Groups and Computation
, 1997
Abstract

Cited by 30 (5 self)
Generalized FFTs are efficient algorithms for computing a Fourier transform of a function defined on a finite group, or a bandlimited function defined on a compact group. The development of such algorithms has been accompanied and motivated by a growing number of both potential and realized applications. This paper will attempt to survey some of these applications. Appendices include some more detailed examples.
1. A brief history
The now "classical" Fast Fourier Transform (FFT) has a long and interesting history. Originally discovered by Gauss, and later made famous after being rediscovered by Cooley and Tukey [21], it may be viewed as an algorithm which efficiently computes the discrete Fourier transform, or DFT. In between Gauss and Cooley-Tukey, others developed special cases of the algorithm, usually motivated by the need for efficient data analysis of one sort or another. To cite but a few examples, Gauss was interested in efficiently interpolating the orbits of asteroids [43...
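As a concrete reference point for the "classical" case discussed above, here is a sketch of the recursive radix-2 Cooley-Tukey FFT alongside the naive O(n²) DFT it accelerates. The function names are ours, and the input length is assumed to be a power of two.

```python
import cmath

def fft(a):
    """Recursive radix-2 Cooley-Tukey FFT; len(a) must be a power of two."""
    n = len(a)
    if n == 1:
        return list(a)
    even, odd = fft(a[0::2]), fft(a[1::2])      # divide on even/odd indices
    out = [0j] * n
    for k in range(n // 2):
        t = cmath.exp(-2j * cmath.pi * k / n) * odd[k]   # twiddle factor
        out[k] = even[k] + t
        out[k + n // 2] = even[k] - t
    return out

def dft(a):
    """Naive O(n^2) discrete Fourier transform, kept for cross-checking."""
    n = len(a)
    return [sum(a[j] * cmath.exp(-2j * cmath.pi * j * k / n) for j in range(n))
            for k in range(n)]
```

The two functions agree to rounding error, which is the whole point: the FFT computes the same transform in O(n log n) operations.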
Separation of Variables and the Computation of Fourier Transforms on Finite Groups, I
 J. of the Amer. Math. Soc.
, 1997
Abstract

Cited by 17 (7 self)
This paper introduces new techniques for the efficient computation of a Fourier transform on a finite group. We present a divide-and-conquer approach to the computation. The divide aspect uses factorizations of group elements to reduce the matrix sum of products for the Fourier transform to simpler sums of products. This is the separation of variables algorithm. The conquer aspect is the final computation of matrix products, which we perform efficiently using a special form of the matrices. This form arises from the use of subgroup-adapted representations and their structure when evaluated at elements which lie in the centralizers of subgroups in a subgroup chain. We present a detailed analysis of the matrix multiplications arising in the calculation and obtain easy-to-use upper bounds for the complexity of our algorithm in terms of representation-theoretic data for the group of interest. Our algorithm encompasses many of the known examples of fast Fourier transforms. We recover the b...
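For orientation, the "matrix sum of products" being accelerated is the transform fhat(rho) = sum over g of f(g) rho(g), taken over all irreducible representations rho. A minimal sketch for the simplest case, a cyclic group Z_n, where every irreducible representation is a one-dimensional character; this illustrates only the definition and the inversion formula, not the separation-of-variables algorithm itself.

```python
import cmath

def group_ft(f):
    """Fourier transform of f on the cyclic group Z_n, as the sum of products
    fhat(rho) = sum_g f(g) rho(g). Z_n is abelian, so each irreducible
    representation is the 1-dimensional character chi_k(j) = exp(2*pi*i*j*k/n)."""
    n = len(f)
    return [sum(f[j] * cmath.exp(2j * cmath.pi * j * k / n) for j in range(n))
            for k in range(n)]

def group_ift(fhat):
    """Fourier inversion: f(g) = (1/|G|) sum_rho d_rho tr(fhat(rho) rho(g^-1));
    for Z_n every d_rho = 1 and rho(g^-1) is the conjugate character value."""
    n = len(fhat)
    return [sum(fhat[k] * cmath.exp(-2j * cmath.pi * j * k / n)
                for k in range(n)) / n
            for j in range(n)]
```

The naive double sum costs O(|G|²) scalar operations; the factorization techniques in the paper reduce this for nonabelian groups, just as the FFT does for Z_n.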
Sequentially-Allocated Merge-Split Sampler for Conjugate and Nonconjugate Dirichlet Process Mixture Models
, 2005
Abstract

Cited by 12 (0 self)
This paper proposes a new efficient merge-split sampler for both conjugate and nonconjugate Dirichlet process mixture (DPM) models. These Bayesian nonparametric models are usually fit using Markov chain Monte Carlo (MCMC) or sequential importance sampling (SIS). The latest generation of Gibbs and Gibbs-like samplers for both conjugate and nonconjugate DPM models effectively update the model parameters, but can have difficulty in updating the clustering of the data. To overcome this deficiency, merge-split samplers have been developed, but until now these have been limited to conjugate or conditionally-conjugate DPM models. This paper proposes a new MCMC sampler, called the sequentially-allocated merge-split (SAMS) sampler. The sampler borrows ideas from sequential importance sampling. Splits are proposed by sequentially allocating observations to one of two split components using allocation probabilities that condition on previously allocated data. The SAMS sampler is applicable to general nonconjugate DPM models as well as conjugate models. Further, the proposed sampler is substantially more efficient than existing conjugate and nonconjugate samplers.
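A hedged sketch of the sequential-allocation idea behind the split proposal: two anchor observations seed two components, and the remaining members of the cluster are allocated one at a time with probability proportional to component size times the conjugate predictive density given the data already allocated. The conjugate model here (y ~ N(mu, 1) with mu ~ N(0, 1)), the uniform anchor choice, and the random scan order are illustrative assumptions; the accumulated log proposal probability is what a Metropolis-Hastings acceptance ratio would need.

```python
import math
import random

def norm_pred_logpdf(y, n, s):
    """Log predictive density of y given a component with n observations
    summing to s, under the conjugate model y ~ N(mu, 1), mu ~ N(0, 1)."""
    mean, var = s / (n + 1), 1.0 + 1.0 / (n + 1)
    return -0.5 * math.log(2 * math.pi * var) - 0.5 * (y - mean) ** 2 / var

def sams_split(ys, rng):
    """Sketch of a SAMS-style split proposal for one cluster's data ys.

    Anchors two observations in separate components, then sequentially
    allocates the rest, each with probability proportional to
    (component size) x (predictive density given already-allocated data).
    Returns (component assignment per observation, log proposal probability)."""
    idx = list(range(len(ys)))
    i, j = rng.sample(idx, 2)                 # two anchor observations
    comps = {0: [ys[i]], 1: [ys[j]]}
    assign = {i: 0, j: 1}
    logq = 0.0
    rest = [k for k in idx if k not in (i, j)]
    rng.shuffle(rest)                         # random allocation order
    for k in rest:
        ws = []
        for c in (0, 1):
            n, s = len(comps[c]), sum(comps[c])
            ws.append(n * math.exp(norm_pred_logpdf(ys[k], n, s)))
        p1 = ws[1] / (ws[0] + ws[1])
        c = 1 if rng.random() < p1 else 0
        logq += math.log(p1 if c == 1 else 1.0 - p1)   # for the MH ratio
        comps[c].append(ys[k])
        assign[k] = c
    return [assign[k] for k in idx], logq
```

Because each allocation conditions on the data already placed in each component, well-separated clusters tend to be proposed intact, which is what makes such splits acceptable often enough to be useful.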
INVARIANCE CONDITIONS FOR RANDOM CURVATURE MODELS
Abstract

Cited by 9 (9 self)
Abstract. A class of probability models is introduced with the objective of representing certain properties of the geometric optics of the human eye. Astigmatic probability laws are those in which the extreme curvature values in the anterior corneal surface, measured at circularly arranged and equally spaced locations, are displaced by an angular separation of approximately 90 degrees. The relationship between the symmetry invariance of these probability laws for curvature data and probability laws for the ranking permutations associated with the ordering of these data is obtained. A distinction is made between the condition in which the components of the curvature ensemble are represented as real numbers and that in which these curvatures are color-coded and take values in a finite totally ordered set. A constructive principle for astigmatic laws is outlined, based on algebraic arguments for the analysis of structured data.
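As a small illustration of the displacement condition (not taken from the paper), one can check whether the locations of the extreme curvature values in an ensemble measured at equally spaced points around a circle are separated by roughly 90 degrees:

```python
def axis_separation_deg(curvatures):
    """Angular separation (degrees) between the locations of the maximum and
    minimum curvature values, assuming the measurements are taken at
    len(curvatures) equally spaced points around a full circle. Values near
    90 are consistent with the astigmatic displacement described above."""
    n = len(curvatures)
    step = 360.0 / n
    i_max = max(range(n), key=lambda i: curvatures[i])
    i_min = min(range(n), key=lambda i: curvatures[i])
    d = abs(i_max - i_min) * step % 360.0
    return min(d, 360.0 - d)          # shorter arc between the two locations
```

For example, eight equally spaced measurements with the maximum at position 0 and the minimum two positions away give a separation of 2 x 45 = 90 degrees.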
Efficient Computation of Isotypic Projections for the Symmetric Group
, 1993
Abstract

Cited by 6 (3 self)
Spectral analysis on the symmetric group Sn calls for computing projections of functions defined on Sn and its homogeneous spaces onto invariant subspaces. In particular, for the analysis of partially ranked data, the appropriate homogeneous spaces are given as quotients by Young subgroups. Here the naive character-theoretic approach to computing projections requires O(n · n!) operations. In this paper two types of polynomial-time algorithms (quadratic in the size of the homogeneous space) are presented for partially ranked data. The first approach makes use of a more careful organization of the character-theoretic computation and is applicable to arbitrary finite groups and their homogeneous spaces. The second approach makes use of the techniques of the combinatorial Radon transform.
1. Introduction
Let G be a finite group acting transitively on a set X. Often X is called a homogeneous space for G. Let L(X) denote the vector space of complex-valued functions on X. Then L(X) na...
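A sketch of the naive character-theoretic projection for the smallest interesting case, G = S3 acting on itself: P_lam f(x) = (d_lam/|G|) sum_g chi_lam(g) f(g^-1 x) projects f in L(S3) onto the isotypic component of the irreducible lam. The tuple representation of permutations and the hard-coded S3 character table are implementation choices; the cost per projection, a sum over all of G for every point of X, is exactly the naive scaling the paper's algorithms improve on.

```python
from itertools import permutations

G = list(permutations(range(3)))        # the symmetric group S3, as tuples

def compose(p, q):
    """Composition of permutations: (p * q)(i) = p[q[i]]."""
    return tuple(p[q[i]] for i in range(3))

def inverse(p):
    """Inverse permutation."""
    inv = [0] * 3
    for i, pi in enumerate(p):
        inv[pi] = i
    return tuple(inv)

def char(p):
    """Character values (trivial, sign, 2-dim standard) of S3, read off the
    cycle type via the number of fixed points."""
    fixed = sum(1 for i in range(3) if p[i] == i)
    if fixed == 3:
        return (1.0, 1.0, 2.0)          # identity
    if fixed == 1:
        return (1.0, -1.0, 0.0)         # transposition
    return (1.0, 1.0, -1.0)             # 3-cycle

def isotypic_projection(f, lam):
    """Project f (a dict on G) onto the isotypic component of irreducible lam:
    P_lam f(x) = (d_lam / |G|) * sum_g chi_lam(g) f(g^-1 x)."""
    d = char(G[0])[lam]                 # dimension d_lam = chi_lam(identity)
    return {x: d / len(G) * sum(char(g)[lam] * f[compose(inverse(g), x)]
                                for g in G)
            for x in G}
```

The three projections are idempotent and sum back to f, which follows from the column orthogonality of the character table and gives a direct numerical check.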
On a nonparametric recursive estimator of the mixing distribution
 Sankhyā Ser. A
, 2002
Abstract

Cited by 3 (0 self)
SUMMARY. Routinely in statistical applications, hierarchical models arise in which unobserved random effects contribute to heterogeneity amongst sampling units. An easily computable, smooth nonparametric estimate of the underlying mixing distribution can be derived as an approximate nonparametric Bayes estimate under a Dirichlet process prior. I discuss the recursive estimation algorithm, its consistency properties, and its application in several examples, including its use as a model diagnostic in the analysis of DNA microarray gene expression data.
1. Nonparametric Mixture Models
A useful stochastic model to account for heterogeneity amongst experimental units is the nonparametric mixture model. From an unknown distribution F there arise independent and identically distributed but unobserved draws φ1, φ2, ..., φn in a set Φ. Given these φi's, conditionally independent random variables x1, x2, ..., xn are observed, where the law of xi given φi is described by a known sampling density p(x | φ) with respect to some dominating measure µ on the sample space. This framework covers many examples, and I consider two for illustration: 1. φi ∈ Φ = [0, 1] and xi is binomially distributed with success probability φi and sample size m. I study an application of this model to a well-studied experiment on thumbtack tossing in which m = 9, n = 320 and xi counts the number of times the ith tack lands with its point facing up (Beckett and Diaconis, 1994). Paper received October 2000; revised November 2001. AMS (2000) subject classification. Primary 65C60; secondary 62G07.
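A sketch of a one-pass recursive estimator of the mixing distribution for the binomial example, on a discrete grid over Φ = [0, 1]. The specific update f_i = (1 - w_i) f_{i-1} + w_i × (posterior of f_{i-1} given x_i) and the weights w_i = 1/(i + 1) are assumptions for illustration; the paper's exact recursion may differ.

```python
import math

def binom_pmf(x, m, phi):
    """Binomial(m, phi) probability of x successes."""
    return math.comb(m, x) * phi ** x * (1 - phi) ** (m - x)

def recursive_mixing_estimate(xs, m, grid_size=101, w=lambda i: 1.0 / (i + 1)):
    """One-pass recursive estimate of the mixing distribution F on [0, 1]
    for a binomial(m, phi) mixture, represented on a grid of midpoints.

    At step i the running estimate is shrunk toward its own posterior
    given x_i:  f_i = (1 - w_i) f_{i-1} + w_i * f_{i-1} p(x_i|.) / normalizer.
    The weight schedule w is an assumed choice, not taken from the paper."""
    grid = [(j + 0.5) / grid_size for j in range(grid_size)]
    f = [1.0 / grid_size] * grid_size           # uniform starting estimate
    for i, x in enumerate(xs):
        lik = [binom_pmf(x, m, phi) for phi in grid]
        norm = sum(fj * lj for fj, lj in zip(f, lik))
        wi = w(i + 1)
        f = [(1 - wi) * fj + wi * fj * lj / norm for fj, lj in zip(f, lik)]
    return grid, f
```

Each update preserves total mass one, and a stream of high counts pushes the estimated mixing mass toward large φ, as one would expect.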