Results 1-10 of 62
Model-based Geostatistics
 Applied Statistics
, 1998
Abstract

Cited by 96 (4 self)
Conventional geostatistical methodology solves the problem of predicting the realised value of a linear functional of a Gaussian spatial stochastic process, S(x), based on observations Y_i = S(x_i) + Z_i at sampling locations x_i, where the Z_i are mutually independent, zero-mean Gaussian random variables. We describe two spatial applications for which Gaussian distributional assumptions are clearly inappropriate. The first concerns the assessment of residual contamination from nuclear weapons testing on a South Pacific island, in which the sampling method generates spatially indexed Poisson counts conditional on an unobserved spatially varying intensity of radioactivity; we conclude that a conventional geostatistical analysis oversmooths the data and underestimates the spatial extremes of the intensity. The second application provides a description of spatial variation in the risk of campylobacter infections relative to other enteric infections in part of North Lancashire and South C...
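The Gaussian model in this abstract can be illustrated with a minimal 1-D sketch. Everything below (the exponential covariance, its range and variance parameters, the grid of locations) is an illustrative assumption, not the paper's own setup; it shows simulation from Y_i = S(x_i) + Z_i and the standard kriging predictor E[S(x0) | Y] = k0^T (K + tau^2 I)^{-1} Y.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D example: S(x) is a zero-mean Gaussian process with an
# exponential covariance; observations are Y_i = S(x_i) + Z_i with noise
# variance tau2 (the "nugget"). All parameter values are illustrative.
def exp_cov(d, sigma2=1.0, phi=0.2):
    return sigma2 * np.exp(-np.abs(d) / phi)

x_obs = np.linspace(0.0, 1.0, 20)          # sampling locations x_i
tau2 = 0.1                                 # variance of the noise Z_i

# Simulate data from the model.
K = exp_cov(x_obs[:, None] - x_obs[None, :])
S = rng.multivariate_normal(np.zeros(len(x_obs)), K)
y = S + rng.normal(0.0, np.sqrt(tau2), size=len(x_obs))

# Simple kriging predictor of S(x0): E[S(x0) | Y] = k0^T (K + tau2 I)^{-1} Y.
def krige(x0):
    k0 = exp_cov(x0 - x_obs)
    return k0 @ np.linalg.solve(K + tau2 * np.eye(len(x_obs)), y)

print(krige(0.37))
```

Because of the nugget tau2, the predictor smooths the data rather than interpolating it exactly; this is the "oversmoothing" behaviour the abstract argues is harmful when the true field is non-Gaussian.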
Using multivariate adaptive regression splines to predict the distributions of New Zealand's freshwater
Abstract

Cited by 16 (6 self)
This paper deals with these observations as records of occurrence, although strictly speaking they are records of capture. We recognise the potential for confounding between detectability, capture method, and environmental relationships, but are generally satisfied that the main trends we model reflect environmental effects on occurrence. A further step would be to include detectability in the models (e.g. MacKenzie et al., 2002), but this is a complex undertaking. Data were extracted for 15 diadromous species (Table 1) that occurred in the dataset with a capture frequency of 0.5% or above. The anguillids and Rhombosolea retiaria are catadromous, whereas Geotria australis and some retropinnid stocks are anadromous. The remaining species are amphidromous, meaning that adults remain resident in freshwater, but larval fish are carried out to sea where they spend a short period before migrating back to freshwater to grow to adulthood. The scope of freshwater habitat accessible ...
Fig. 1: Sample sites from the New Zealand Freshwater Fish Database used in the analysis (open circles). Only rivers with an annual mean flow >10 m³ s⁻¹ are shown. Those shown in light grey were excluded from the analysis because of known significant downstream obstructions to fish migration to/from the sea.
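MARS models a response as a sum of hinge functions max(0, x - t) and max(0, t - x). The sketch below is not the MARS algorithm itself (it skips the adaptive forward/backward knot search that gives the method its name) but only shows a least-squares fit on a fixed set of hinge basis functions; the gradient, knot positions, and presence/absence data are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative presence/absence data along a hypothetical environmental
# gradient: occurrence probability rises smoothly with x.
x = rng.uniform(0, 10, 200)
p_true = 1 / (1 + np.exp(-(x - 5)))
y = rng.binomial(1, p_true).astype(float)

# Fixed-knot hinge basis (MARS would choose these knots adaptively).
knots = np.array([2.5, 5.0, 7.5])
B = np.column_stack(
    [np.ones_like(x)]
    + [np.maximum(0.0, x - t) for t in knots]   # right hinges max(0, x - t)
    + [np.maximum(0.0, t - x) for t in knots]   # left hinges  max(0, t - x)
)
coef, *_ = np.linalg.lstsq(B, y, rcond=None)
fitted = B @ coef
print(fitted[:5])
```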
A survey of Monte Carlo algorithms for maximizing the likelihood of a two-stage hierarchical model
, 2001
Abstract

Cited by 10 (4 self)
Likelihood inference with hierarchical models is often complicated by the fact that the likelihood function involves intractable integrals. Numerical integration (e.g. quadrature) is an option if the dimension of the integral is low but quickly becomes unreliable as the dimension grows. An alternative approach is to approximate the intractable integrals using Monte Carlo averages. Several different algorithms based on this idea have been proposed. In this paper we discuss the relative merits of simulated maximum likelihood, Monte Carlo EM, Monte Carlo Newton-Raphson and stochastic approximation.
Key words and phrases: Efficiency, Monte Carlo EM, Monte Carlo Newton-Raphson, Rate of convergence, Simulated maximum likelihood, Stochastic approximation.
All three authors partially supported by NSF Grant DMS-0072827.
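Of the four algorithms surveyed, simulated maximum likelihood is the simplest to sketch: replace the intractable integral over the random effects by a Monte Carlo average, then maximize the approximated likelihood. The two-stage model, parameter values, and grid search below are all illustrative assumptions (the variance component is even held fixed at its true value so that only the mean is estimated).

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical two-stage model: y_ij = mu + u_i + e_ij, with random effects
# u_i ~ N(0, s2u) and errors e_ij ~ N(0, 1). The likelihood integrates out
# each u_i; we approximate that integral by a Monte Carlo average.
n_groups, n_per, mu_true, s2u = 50, 5, 1.5, 0.5
u = rng.normal(0, np.sqrt(s2u), n_groups)
y = mu_true + u[:, None] + rng.normal(size=(n_groups, n_per))

M = 2000
u_draws = rng.normal(0, np.sqrt(s2u), M)   # common draws reused for every mu

def mc_loglik(mu):
    # For each group, average the integrand f(y_i | u, mu) over the u draws
    # (normalizing constants are dropped; they do not affect the argmax).
    ll = 0.0
    for yi in y:
        dens = np.exp(-0.5 * ((yi[None, :] - mu - u_draws[:, None]) ** 2).sum(axis=1))
        ll += np.log(dens.mean())
    return ll

grid = np.linspace(0.5, 2.5, 81)
mu_hat = grid[np.argmax([mc_loglik(m) for m in grid])]
print(mu_hat)
```

Reusing the same draws `u_draws` for every candidate `mu` (common random numbers) keeps the approximated likelihood surface smooth, which is essential for the maximization step.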
Wavelets in Statistics: Beyond the Standard Assumptions
 Phil. Trans. Roy. Soc. Lond. A
, 1999
Abstract

Cited by 9 (2 self)
In this paper, attention has been focused on methods that treat coefficients at least as if they were independent. However, it is intuitively clear that if one coefficient in the wavelet array is nonzero, then it is more likely (in some appropriate sense) that neighbouring coefficients will be also. One way of incorporating this notion is by some form of block thresholding, where coefficients are considered in neighbouring blocks; see for example Hall et al. (1998) and Cai & Silverman (1998). An obvious question for future consideration is to integrate the ideas of block thresholding and related methods within the range of models and methods considered in this paper.
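The block-thresholding idea mentioned above can be sketched in a few lines: instead of keeping or killing coefficients one at a time, group them into blocks and keep or kill each block as a whole, exploiting the tendency of large coefficients to cluster. The block length and threshold below are arbitrary illustrative choices, not those of Hall et al. or Cai & Silverman.

```python
import numpy as np

# Hard block thresholding: a block of coefficients survives only if its
# mean energy exceeds the threshold lam; otherwise the whole block is zeroed.
def block_threshold(coeffs, block_len=4, lam=1.0):
    out = np.zeros_like(coeffs)
    for start in range(0, len(coeffs), block_len):
        block = coeffs[start:start + block_len]
        if np.mean(block ** 2) > lam:
            out[start:start + block_len] = block
    return out

c = np.array([0.1, -0.2, 0.1, 0.0,    # low-energy block -> zeroed
              3.0, 2.5, -2.8, 3.1,    # high-energy block -> kept whole
              0.2, 0.1, -0.1, 0.0])   # low-energy block -> zeroed
print(block_threshold(c))
```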
On discriminative joint density modeling
 In 16th European Conference on Machine Learning (ECML)
, 2005
Abstract

Cited by 6 (3 self)
We study discriminative joint density models, that is, generative models for the joint density p(c, x) learned by maximizing a discriminative cost function, the conditional likelihood. We use the framework to derive generative models for generalized linear models, including logistic regression, linear discriminant analysis, and discriminative mixture of unigrams. The benefits of deriving the discriminative models from joint density models are that it is easy to extend the models and interpret the results, and missing data can be treated using justified standard methods.
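A minimal sketch of the generative/discriminative contrast the abstract describes: for a joint Gaussian model with shared covariance, maximizing the conditional likelihood p(c | x) reduces to logistic regression. The code below just runs plain gradient ascent on the conditional log-likelihood; the data, learning rate, and iteration count are illustrative assumptions, not the paper's experiments.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic binary classification data with an intercept column.
n = 400
X = np.column_stack([np.ones(n), rng.normal(size=n)])
w_true = np.array([-0.5, 2.0])
c = rng.binomial(1, 1 / (1 + np.exp(-X @ w_true)))

# Gradient ascent on the mean conditional log-likelihood sum_i log p(c_i | x_i).
w = np.zeros(2)
for _ in range(2000):
    p = 1 / (1 + np.exp(-X @ w))
    w += 0.5 * X.T @ (c - p) / n   # gradient of the conditional log-likelihood
print(w)
```

Fitting the same class-conditional Gaussians by maximizing the *joint* likelihood instead would estimate class means and a shared covariance directly; the paper's point is that the discriminative objective can be applied to the same joint parameterization.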
Bayesian Varying-coefficient Models using Adaptive Regression Splines
, 2000
Abstract

Cited by 6 (1 self)
Varying-coefficient models provide a flexible framework for semi- and nonparametric generalized regression analysis. We present a fully Bayesian B-spline basis function approach with adaptive knot selection. For each of the unknown regression functions or varying coefficients, the number and location of knots and the B-spline coefficients are estimated simultaneously using reversible jump Markov chain Monte Carlo sampling. The overall procedure can therefore be viewed as a kind of Bayesian model averaging. Although Gaussian responses are covered by the general framework, the method is particularly useful for fundamentally non-Gaussian responses, where fewer alternatives are available. We illustrate the approach with a thorough application to two data sets analyzed previously in the literature: the kyphosis data set with a binary response and survival data from the Veteran's Administration lung cancer trial.
Keywords: B-spline basis; knot selection; non-Gaussian response; non- and sem...
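The paper's contribution is selecting the number and location of knots by reversible jump MCMC, which is too involved to reproduce here. The sketch below shows only the inner regression step with a *fixed* knot set, and substitutes a truncated-power basis for the B-spline basis; data, knots, and noise level are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic smooth regression problem.
x = np.sort(rng.uniform(0, 1, 150))
f_true = np.sin(2 * np.pi * x)
y = f_true + rng.normal(0, 0.2, size=x.size)

# Cubic truncated-power basis with a fixed set of interior knots
# (the paper would instead move/add/delete knots via reversible jump MCMC).
knots = np.linspace(0.1, 0.9, 9)
B = np.column_stack([np.ones_like(x), x, x**2, x**3]
                    + [np.maximum(0.0, x - t) ** 3 for t in knots])
coef, *_ = np.linalg.lstsq(B, y, rcond=None)
fit = B @ coef
print(np.sqrt(np.mean((fit - f_true) ** 2)))   # RMSE against the true curve
```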
Statistical Modelling of Time Series Using Non-Decimated Wavelet Representations
, 1998
Abstract

Cited by 5 (3 self)
This article proposes the use of time-ordered non-decimated wavelet or non-decimated wavelet packet coefficients to provide a representation of a time series (explanatory). The resulting representations are then used as variables in a statistical model to provide predictions of another time series (response). The statistical model provides valuable information about which components in the explanatory time series drive the response time series. To represent our time series we use a collection of basis functions known as wavelet packets. Our methodology modifies the standard wavelet packet representation by providing an overdetermined representation of shifted wavelet packets where the shifts are not restricted to the usual wavelet grid. We introduce a fast algorithm for carrying out the time-ordered non-decimated wavelet packet transform. The shifted wavelet packet bases can represent many classes of time series sparsely. The sparsity provides an effective dimension reduction which en...
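The key property of the non-decimated transform used above can be shown with one level of a Haar transform: because no subsampling is applied, there is one smooth and one detail coefficient per time point, and shifting the input simply shifts the coefficients. The circular boundary handling below is an illustrative choice, not the paper's.

```python
import numpy as np

# One level of a non-decimated (stationary) Haar transform. Unlike the
# decimated transform there is no subsampling, so the output has the same
# length as the input and the transform is shift equivariant.
def haar_swt_level(x):
    x = np.asarray(x, dtype=float)
    shifted = np.roll(x, -1)                 # circular boundary handling
    smooth = (x + shifted) / np.sqrt(2.0)    # local averages
    detail = (x - shifted) / np.sqrt(2.0)    # local differences
    return smooth, detail

x = np.array([4.0, 6.0, 10.0, 12.0])
s, d = haar_swt_level(x)
print(s, d)
```

Shift equivariance is exactly what makes time-ordered non-decimated coefficients usable as regression variables: coefficient k always describes the series around time k, regardless of alignment with a dyadic grid.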
Adjust quality scores from alignment and improve sequencing accuracy
 Nucleic Acids Res
, 2004
"... accuracy ..."