Results 1–10 of 23
Equi-energy sampler with applications in statistical inference and statistical mechanics
, 2006
Abstract

Cited by 50 (4 self)
We introduce a new sampling algorithm, the equi-energy sampler, for efficient statistical sampling and estimation. Complementary to the widely used temperature-domain methods, the equi-energy sampler, utilizing the temperature–energy duality, targets the energy directly. The focus on the energy function not only facilitates efficient sampling, but also provides a powerful means for statistical estimation, for example, the calculation of the density of states and microcanonical averages in statistical mechanics. The equi-energy sampler is applied to a variety of problems, including exponential regression in statistics, motif sampling in computational biology and protein folding in biophysics.
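As a toy illustration of the equi-energy jump described above, the following sketch runs a simplified two-rung version on a 1-d bimodal target. The target, temperatures and energy-ring boundaries are all invented for illustration; the paper's applications (exponential regression, motif sampling, protein folding) are far richer.

```python
import numpy as np

rng = np.random.default_rng(0)

def energy(x):
    # H(x) = -log(unnormalised density) of an equal mixture of
    # N(-3, 0.5^2) and N(3, 0.5^2)
    return -np.log(np.exp(-2.0 * (x + 3) ** 2) + np.exp(-2.0 * (x - 3) ** 2))

T_HI, T_LO = 10.0, 1.0          # two-rung temperature ladder (assumed)
rings = np.array([2.0, 5.0])    # energy-ring boundaries (assumed)

def ring(h):
    return int(np.searchsorted(rings, h))

def rwm_step(x, T, scale):
    # standard random walk Metropolis step targeting exp(-H/T)
    y = x + scale * rng.standard_normal()
    if np.log(rng.random()) < -(energy(y) - energy(x)) / T:
        return y
    return x

# Stage 1: run the high-temperature chain and bin its states by energy ring.
bank = [[] for _ in range(len(rings) + 1)]
x = 0.0
for _ in range(20000):
    x = rwm_step(x, T_HI, 4.0)
    bank[ring(energy(x))].append(x)

# Stage 2: low-temperature chain with occasional equi-energy jumps
# to banked states from the SAME energy ring.
samples = []
x = -3.0
for _ in range(20000):
    if rng.random() < 0.1 and bank[ring(energy(x))]:
        y = rng.choice(bank[ring(energy(x))])
        log_acc = -(energy(y) - energy(x)) * (1 / T_LO - 1 / T_HI)
        if np.log(rng.random()) < log_acc:
            x = y
    else:
        x = rwm_step(x, T_LO, 0.5)
    samples.append(x)

samples = np.array(samples)
```

Because banked states in the same ring have nearly equal energy, the jump acceptance is high, letting the cold chain hop between modes that a small-step random walk would rarely cross.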
Particle filters for partially observed diffusions
, 2006
Abstract

Cited by 36 (8 self)
In this paper we introduce novel particle filters for a class of partially observed continuous-time dynamic models where the signal is given by a multivariate diffusion process. We consider a variety of observation schemes, including diffusion observed with error, observation of a subset of the components of the multivariate diffusion, and arrival times of a Poisson process whose intensity is a known function of the diffusion (Cox process). Unlike available methods, our particle filters do not require approximations of the transition and/or the observation density using time-discretisations. Instead, they build on recent methodology for the exact simulation of diffusion processes and the unbiased estimation of the transition density, as described in Beskos et al. (2005c). In particular, we require the Generalised Poisson Estimator, a substantial generalisation of the Poisson Estimator (Beskos et al., 2005c), which is introduced in this paper. Thus, our filters avoid the systematic biases caused by time-discretisations and have significant computational advantages over alternative continuous-time filters. These advantages are supported by a central limit theorem established in this paper. Keywords: continuous-time filtering, Exact Algorithm, central limit theorem, Cox process
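The generic particle-filter recursion (propagate, weight by the likelihood, resample) that the paper builds on can be sketched for a toy discrete-time linear-Gaussian model. The model and all parameter values here are assumptions for illustration; the paper's contribution is precisely that its filters handle continuous-time diffusions without the discretisation this toy relies on.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy state-space model: x_t = 0.9 x_{t-1} + N(0,1),  y_t = x_t + N(0,1)
T, N = 200, 1000
a, q, r = 0.9, 1.0, 1.0

x_true = np.zeros(T)
for t in range(1, T):
    x_true[t] = a * x_true[t - 1] + np.sqrt(q) * rng.standard_normal()
y = x_true + np.sqrt(r) * rng.standard_normal(T)

# Bootstrap particle filter.
particles = rng.standard_normal(N)
filt_mean = np.zeros(T)
for t in range(T):
    particles = a * particles + np.sqrt(q) * rng.standard_normal(N)  # propagate
    logw = -0.5 * (y[t] - particles) ** 2 / r                        # weight
    w = np.exp(logw - logw.max())
    w /= w.sum()
    filt_mean[t] = np.sum(w * particles)
    particles = particles[rng.choice(N, size=N, p=w)]                # resample
```

The filtered mean tracks the latent state more closely than the raw observations do, which is the basic payoff of the recursion.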
Time series analysis via mechanistic models. In review; prepublished at arxiv.org/abs/0802.0021
, 2008
Abstract

Cited by 36 (10 self)
The purpose of time series analysis via mechanistic models is to reconcile the known or hypothesized structure of a dynamical system with observations collected over time. We develop a framework for constructing nonlinear mechanistic models and carrying out inference. Our framework permits the consideration of implicit dynamic models, meaning statistical models for stochastic dynamical systems which are specified by a simulation algorithm to generate sample paths. Inference procedures that operate on implicit models are said to have the plug-and-play property. Our work builds on recently developed plug-and-play inference methodology for partially observed Markov models. We introduce a class of implicitly specified Markov chains with stochastic transition rates, and we demonstrate its applicability to open problems in statistical inference for biological systems. As one example, these models are shown to give a fresh perspective on measles transmission dynamics. As a second example, we present a mechanistic analysis of cholera incidence data, involving interaction between two competing strains of the pathogen Vibrio cholerae.
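An implicitly specified model in the plug-and-play sense is one defined only through its simulator. A minimal sketch, using a Gillespie simulation of a stochastic SIR epidemic model with invented parameter values (the paper's measles and cholera models are considerably more elaborate):

```python
import numpy as np

rng = np.random.default_rng(2)

def simulate_sir(beta, gamma, s0, i0, r0, t_end):
    """Gillespie simulation of a stochastic SIR model: the model is
    specified only by this sample-path simulator (plug-and-play)."""
    s, i, r, t = s0, i0, r0, 0.0
    n = s0 + i0 + r0
    path = [(t, s, i, r)]
    while t < t_end and i > 0:
        rate_inf = beta * s * i / n      # infection:  S + I -> 2I
        rate_rec = gamma * i             # recovery:   I -> R
        total = rate_inf + rate_rec
        t += rng.exponential(1.0 / total)
        if rng.random() < rate_inf / total:
            s, i = s - 1, i + 1
        else:
            i, r = i - 1, r + 1
        path.append((t, s, i, r))
    return path

path = simulate_sir(beta=1.5, gamma=0.5, s0=990, i0=10, r0=0, t_end=50.0)
```

Plug-and-play inference procedures only ever call such a simulator; they never need the transition densities in closed form.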
Stochastic modeling in nanoscale biophysics: subdiffusion within proteins. Annals of Applied Statistics
, 2008
Abstract

Cited by 19 (2 self)
Advances in nanotechnology have allowed scientists to study biological processes on an unprecedented nanoscale, molecule-by-molecule basis, opening the door to addressing many important biological problems. A phenomenon observed in recent nanoscale single-molecule biophysics experiments is subdiffusion, which largely departs from the classical Brownian diffusion theory. In this paper, by incorporating fractional Gaussian noise into the generalized Langevin equation, we formulate a model to describe subdiffusion. We conduct a detailed analysis of the model, including (i) a spectral analysis of the stochastic integro-differential equations introduced in the model and (ii) a microscopic derivation of the model from a system of interacting particles. In addition to its analytical tractability and clear physical underpinning, the model is capable of explaining data collected in fluorescence studies on single protein molecules. Excellent agreement between the model prediction and the single-molecule experimental data is observed.
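The subdiffusive signature can be seen in fractional Brownian motion itself, whose mean-squared displacement grows as t^(2H) with Hurst index H &lt; 1/2, slower than the linear-in-t MSD of ordinary Brownian motion. A toy construction via the exact fBm covariance (the paper instead works with fractional Gaussian noise inside a generalized Langevin equation; H and the grid here are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)

H, n, n_paths = 0.3, 100, 4000
t = np.arange(1, n + 1) / n
# exact fBm covariance: C(s,t) = (s^2H + t^2H - |t-s|^2H) / 2
cov = 0.5 * (t[:, None] ** (2 * H) + t[None, :] ** (2 * H)
             - np.abs(t[:, None] - t[None, :]) ** (2 * H))
L = np.linalg.cholesky(cov + 1e-12 * np.eye(n))   # jitter for stability
paths = rng.standard_normal((n_paths, n)) @ L.T    # each row ~ fBm on t

msd = (paths ** 2).mean(axis=0)                    # ensemble MSD
slope = np.polyfit(np.log(t), np.log(msd), 1)[0]   # should be close to 2H
```

With H = 0.3, the fitted log-log slope is close to 2H = 0.6 rather than the Brownian value 1.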
The random walk Metropolis: linking theory and practice through a case study.
Abstract

Cited by 9 (4 self)
Summary: The random walk Metropolis (RWM) is one of the most common Markov Chain Monte Carlo algorithms in practical use today. Its theoretical properties have been extensively explored for certain classes of target, and a number of results with important practical implications have been derived. This article draws together a selection of new and existing key results and concepts and describes their implications. The impact of each new idea on algorithm efficiency is demonstrated for the practical example of the Markov modulated Poisson process (MMPP). A reparameterisation of the MMPP which leads to a highly efficient RWM within Gibbs algorithm in certain circumstances is also developed.
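The RWM itself is a few lines. A minimal sketch on a standard-normal target, using the classic 1-d proposal scale of roughly 2.4 times the target standard deviation discussed in the tuning literature (target and scale here are illustrative, not the paper's MMPP example):

```python
import numpy as np

rng = np.random.default_rng(4)

def rwm(logpi, x0, scale, n_iter):
    """Random walk Metropolis with Gaussian proposals."""
    x, lp = x0, logpi(x0)
    chain = np.empty(n_iter)
    accepts = 0
    for i in range(n_iter):
        y = x + scale * rng.standard_normal()
        lpy = logpi(y)
        # accept with probability min(1, pi(y)/pi(x))
        if np.log(rng.random()) < lpy - lp:
            x, lp = y, lpy
            accepts += 1
        chain[i] = x
    return chain, accepts / n_iter

chain, acc = rwm(lambda x: -0.5 * x * x, 0.0, 2.4, 50000)
```

With this scale the empirical acceptance rate lands near the theoretically efficient range for a 1-d Gaussian target.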
Stochastic mechanochemical kinetics of molecular motors: A multidisciplinary enterprise from a physicist's perspective
 Physics Reports: Review Section of Physics Letters
, 2013
Abstract

Cited by 7 (1 self)
A molecular motor is made of either a single macromolecule or a macromolecular complex. Just like their macroscopic counterparts, molecular motors “transduce” input energy into mechanical work. All the nanomotors considered here operate under isothermal conditions far from equilibrium. Moreover, one of the possible mechanisms of energy transduction, called the Brownian ratchet, does not even have any macroscopic counterpart. But a molecular motor is not synonymous with a Brownian ratchet; a large number of molecular motors execute a noisy power stroke rather than operating as Brownian ratchets. We review not only the structural design and stochastic kinetics of individual single motors, but also their coordination, cooperation and competition, as well as the assembly of multi-module motors in various intracellular kinetic processes. Although all the motors considered here execute mechanical movements, efficiency and power output are not necessarily good measures of performance for some motors. Among the intracellular nanomotors, we consider the porters, sliders and rowers, pistons and hooks, exporters, importers, packers and movers, as well as those that also synthesize, manipulate and degrade “macromolecules of life”. We review mostly the quantitative models for the kinetics of these motors. We also describe several of those motor-driven intracellular stochastic processes for which quantitative models are yet to be developed. In part I, we discuss mainly the methodology and the generic models of various important classes of molecular motors. In part II, we review many specific examples, emphasizing the unity of the basic mechanisms as well as the diversity of operations arising from differences in detailed structure and kinetics. Multidisciplinary research is presented here from the perspective of physicists.
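The simplest stochastic-kinetic model of this kind is a motor hopping on a periodic track with forward rate kf and backward rate kb, giving mean velocity d·(kf − kb) for step size d. A minimal simulation sketch with invented rates (not parameters from the review):

```python
import numpy as np

rng = np.random.default_rng(5)

# Illustrative parameters: forward/backward hopping rates (1/s), step (m)
kf, kb, d = 100.0, 10.0, 8e-9
n_steps = 200000

total_rate = kf + kb
dwell = rng.exponential(1.0 / total_rate, n_steps)   # exponential dwell times
forward = rng.random(n_steps) < kf / total_rate      # direction of each step
position = d * (forward.sum() - (~forward).sum())
velocity = position / dwell.sum()
analytic = d * (kf - kb)                             # mean velocity
```

The simulated velocity converges to the analytic value, the baseline against which richer mechanochemical cycles are compared.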
A methodological framework for Monte Carlo probabilistic inference for diffusion processes
 In Inference and Learning in Dynamic Models
, 2010
Abstract

Cited by 4 (0 self)
The methodological framework developed and reviewed in this article concerns the unbiased Monte Carlo estimation of the transition density of a diffusion process, and the exact simulation of diffusion processes. The former relates to auxiliary variable methods, and it builds on a rich generic Monte Carlo machinery for the unbiased estimation and simulation of infinite series expansions, which relates to techniques used in diverse scientific areas such as population genetics and operational research. The latter is a recent significant advance in the numerics for diffusions; it is based on the so-called Wiener–Poisson factorization of the diffusion measure, and it has interesting connections to the exact simulation of killing times for Brownian motion and to interacting particle systems, which are uncovered in this article. A concrete application to probabilistic inference for diffusion processes is presented by considering the continuous-discrete nonlinear filtering problem.
Stochastic Networks in Nanoscale Biophysics: Modeling Enzymatic Reaction of a Single Protein
Abstract

Cited by 3 (0 self)
Advances in nanotechnology enable scientists for the first time to study biological processes on a nanoscale molecule-by-molecule basis. A surprising discovery from recent nanoscale single-molecule biophysics experiments is that biological reactions involving enzymes behave fundamentally differently from what classical theory predicts. In this article we introduce a stochastic network model to explain the experimental puzzles, modeling enzymatic reactions as a stochastic network connected by different enzyme conformations. Detailed analyses of the model, including analyses of the first-passage-time distributions and goodness of fit, show that the stochastic network model is capable of explaining the experimental surprises. The model is analytically tractable and closely fits experimental data. The biological/chemical meaning of the model is discussed.
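The signature such models explain is non-exponential turnover times. A hypothetical two-conformation enzyme (rates and mixing probability invented for illustration, not the paper's fitted network) already produces a hyperexponential waiting-time distribution, whose coefficient of variation exceeds the value 1 of classical single-exponential kinetics:

```python
import numpy as np

rng = np.random.default_rng(7)

# Each turnover starts from conformation 1 or 2 with probability p / 1-p
# and completes at that conformation's catalytic rate.
k1, k2, p = 10.0, 1.0, 0.5
n = 100000
use_fast = rng.random(n) < p
wait = np.where(use_fast,
                rng.exponential(1 / k1, n),
                rng.exponential(1 / k2, n))

mean_wait = wait.mean()
cv = wait.std() / mean_wait          # = 1 exactly for a single exponential
analytic_mean = p / k1 + (1 - p) / k2
```

The first-passage-time mean matches the analytic mixture value while the CV is well above 1, the kind of departure from classical theory the network model quantifies.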
Bayesian approach to the determination of the kinetic parameters of DNA hairpins under tension
 Journal of Nonlinear Mathematical Physics
, 2011
Abstract

Cited by 2 (0 self)
In this paper we propose a Bayesian scheme for the determination of the unfolding and refolding kinetic rates of DNA hairpins under tension. The method is based on the hypothesis that the unfolding–refolding dynamics is well described by a Markov chain. The results from the Bayesian method are compared with those of widely used techniques, and good agreement is found. This work can be seen as a validation of the standard techniques from a statistical point of view.
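Under the two-state Markov-chain hypothesis, conjugate Beta priors give closed-form posteriors for the per-step hopping probabilities from the transition counts. A minimal sketch with invented true rates (the paper's scheme is applied to measured hairpin trajectories):

```python
import numpy as np

rng = np.random.default_rng(8)

# Simulate a two-state (folded = 0 / unfolded = 1) Markov chain.
p_unfold, p_refold = 0.3, 0.2        # true per-step probabilities (assumed)
n = 5000
state = np.zeros(n, dtype=int)
for t in range(1, n):
    u = rng.random()
    if state[t - 1] == 0:
        state[t] = 1 if u < p_unfold else 0
    else:
        state[t] = 0 if u < p_refold else 1

# Transition counts.
from_f = state[:-1] == 0
n_fu = np.sum(from_f & (state[1:] == 1))     # folded -> unfolded
n_ff = np.sum(from_f & (state[1:] == 0))
n_uf = np.sum(~from_f & (state[1:] == 0))    # unfolded -> folded
n_uu = np.sum(~from_f & (state[1:] == 1))

# Beta(1,1) priors => Beta posteriors; posterior means:
post_unfold = (1 + n_fu) / (2 + n_fu + n_ff)
post_refold = (1 + n_uf) / (2 + n_uf + n_uu)
```

The posterior means recover the true hopping probabilities to within sampling error, and the full Beta posteriors quantify the uncertainty.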
A New Bayesian Unit Root Test in Stochastic Volatility Models
, 2010
Abstract

Cited by 1 (1 self)
A new posterior odds analysis is proposed to test for a unit root in volatility dynamics in the context of stochastic volatility models. This analysis extends the Bayesian unit root test of So and Li (1999, Journal of Business &amp; Economic Statistics) in two important ways. First, a numerically more stable algorithm is introduced to compute the Bayes factor, taking into account the special structure of the competing models. Owing to its numerical stability, the algorithm overcomes the problem of diverged “size” in the marginal likelihood approach. Second, to improve the “power” of the unit root test, a mixed prior specification with random weights is employed. It is shown that the posterior odds ratio is a by-product of Bayesian estimation and can be easily computed by MCMC methods. A simulation study examines the “size” and “power” performance of the new method. An empirical study, based on time series data covering the subprime crisis, reveals some interesting results.
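The posterior-odds logic can be seen in a toy conjugate setting (Beta-binomial, not the stochastic volatility models of the paper): posterior odds = prior odds × Bayes factor, where the Bayes factor is a ratio of marginal likelihoods. Comparing a point null M0: p = 0.5 against M1: p ~ Uniform(0,1) for k successes in n trials:

```python
from math import comb

n, k = 100, 50                       # illustrative data
marg0 = comb(n, k) * 0.5 ** n        # marginal likelihood under M0
marg1 = 1.0 / (n + 1)                # Beta(1,1) marginal: uniform over k
bayes_factor = marg0 / marg1         # > 1 favours the point null M0
posterior_odds = bayes_factor * 1.0  # prior odds taken as 1
```

With data exactly at the null value, the Bayes factor favours M0, illustrating how posterior odds penalise the more diffuse model; in the paper the analogous marginal likelihoods are computed by MCMC rather than in closed form.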