Results 1 - 10 of 165,381
Hierarchical Counterexamples for Discrete-Time Markov Chains
Abstract: "... In this paper we introduce a novel counterexample generation approach for discrete-time Markov chains (DTMCs) with two main advantages: (1) We generate abstract counterexamples, which can be refined in a hierarchical manner. (2) We aim at minimizing the number of states involved in the cou ..."

Cited by 11 (8 self)
Symbolic Counterexample Generation for Discrete-Time Markov Chains
Abstract: "... In this paper we investigate the generation of counterexamples for discrete-time Markov chains (DTMCs) and PCTL properties. Whereas most available methods use explicit representations for at least some intermediate results, our aim is to develop fully symbolic algorithms. As in most related work, ou ..."

Cited by 2 (0 self)
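Both entries above work from the same primitive: a probabilistic counterexample for a reachability property is assembled from high-probability paths through the DTMC. A minimal sketch of that building block (not the hierarchical or symbolic algorithms themselves) is a best-first search for the single most probable path; maximizing a product of probabilities is a shortest-path problem under negative-log weights, and here the product is carried directly. All names below are illustrative:

```python
import heapq

def most_probable_path(P, start, targets):
    """Best-first (Dijkstra-style) search for the most probable path
    from `start` to any state in `targets` in a DTMC with transition
    matrix P[i][j].  Illustrative building block only: the papers above
    construct full counterexamples (sets of such paths), hierarchically
    or symbolically."""
    best = {start: 1.0}                 # best known probability per state
    heap = [(-1.0, start, (start,))]    # (negative probability, state, path)
    while heap:
        neg_p, s, path = heapq.heappop(heap)
        p = -neg_p
        if s in targets:
            return path, p              # first target popped is optimal
        if p < best.get(s, 0.0):
            continue                    # stale heap entry
        for t, pr in enumerate(P[s]):
            q = p * pr
            if pr > 0 and q > best.get(t, 0.0):
                best[t] = q
                heapq.heappush(heap, (-q, t, path + (t,)))
    return None, 0.0
```

Because every transition probability is at most 1, path probabilities only shrink along a path, so the first target state popped from the heap carries the maximal probability, mirroring Dijkstra's argument.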
Reversible jump Markov chain Monte Carlo computation and Bayesian model determination
Biometrika, 1995
Abstract: "... Markov chain Monte Carlo methods for Bayesian computation have until recently been restricted to problems where the joint distribution of all variables has a density with respect to some fixed standard underlying measure. They have therefore not been available for application to Bayesian model determi ..."

Cited by 1330 (24 self)
Exact Sampling with Coupled Markov Chains and Applications to Statistical Mechanics
1996
Abstract: "... For many applications it is useful to sample from a finite set of objects in accordance with some particular distribution. One approach is to run an ergodic (i.e., irreducible aperiodic) Markov chain whose stationary distribution is the desired distribution on this set; after the Markov chain has ..."

Cited by 548 (13 self)
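The coupling idea this abstract refers to can be sketched for the simplest monotone case: run copies of the chain from the two extreme states with shared randomness, starting further and further in the past, until the copies coalesce by time 0; the coalesced value is then an exact draw from the stationary distribution (Propp-Wilson coupling from the past). A toy version for a birth-death chain on a finite state set, with all parameter choices illustrative:

```python
import random

def cftp_sample(n_states, p_up, rng):
    """Propp-Wilson coupling from the past for a monotone birth-death
    chain on {0, ..., n_states-1}: move up with probability p_up, down
    otherwise, reflecting at the boundaries.  Illustrative sketch."""
    def update(x, u):
        # the same random number u drives every copy of the chain
        if u < p_up:
            return min(x + 1, n_states - 1)
        return max(x - 1, 0)

    T = 1
    us = []  # noise for times -1, -2, ...; reused as T grows (essential)
    while True:
        while len(us) < T:
            us.append(rng.random())
        top, bot = n_states - 1, 0
        # run both extreme chains from time -T to time 0 with shared noise
        for t in range(T - 1, -1, -1):   # us[t] is the noise at time -(t+1)
            top = update(top, us[t])
            bot = update(bot, us[t])
        if top == bot:
            return top                   # coalesced: exact stationary draw
        T *= 2                           # otherwise restart further back
```

Monotonicity of the update in x guarantees every intermediate starting state is sandwiched between `bot` and `top`, so coalescence of the two extreme chains implies coalescence of all chains. For `p_up = 0.5` the stationary distribution of this toy chain is uniform.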
Finite state Markov-chain approximations to univariate and vector autoregressions
Economics Letters, 1986
Abstract: "... The paper develops a procedure for finding a discrete-valued Markov chain whose sample paths approximate well those of a vector autoregression. The procedure has applications in those areas of economics, finance, and econometrics where approximate solutions to integral equations are required. ..."

Cited by 472 (0 self)
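In the univariate case the procedure (Tauchen's method) places an evenly spaced grid over the stationary range of the AR(1) process and assigns transition probabilities from the normal CDF of the innovation, lumping the tails into the edge points. A minimal sketch, with the grid-width multiplier `k` and the helper name `tauchen` chosen here for illustration:

```python
from statistics import NormalDist

def tauchen(m, rho, sigma, k=3.0):
    """Discretize the AR(1) process y' = rho*y + eps, eps ~ N(0, sigma^2),
    onto an m-point grid spanning +/- k unconditional standard deviations.
    Returns (grid, P) where P[i][j] = Pr(next state j | current state i).
    Illustrative sketch of Tauchen's (1986) univariate procedure."""
    Phi = NormalDist().cdf
    sigma_y = sigma / (1.0 - rho**2) ** 0.5      # unconditional std dev
    top = k * sigma_y
    step = 2.0 * top / (m - 1)                   # grid spacing
    grid = [-top + i * step for i in range(m)]
    P = []
    for y in grid:
        row = []
        for j, yj in enumerate(grid):
            z = yj - rho * y                     # innovation needed to land at yj
            if j == 0:                           # lump the lower tail
                row.append(Phi((z + step / 2) / sigma))
            elif j == m - 1:                     # lump the upper tail
                row.append(1.0 - Phi((z - step / 2) / sigma))
            else:
                row.append(Phi((z + step / 2) / sigma)
                           - Phi((z - step / 2) / sigma))
        P.append(row)
    return grid, P
```

The interior terms telescope with the two tail terms, so each row sums to one by construction; for persistent processes (rho near 1) most mass stays near the current state.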
Coupled hidden Markov models for complex action recognition
1996
Abstract: "... We present algorithms for coupling and training hidden Markov models (HMMs) to model interacting processes, and demonstrate their superiority to conventional HMMs in a vision task classifying two-handed actions. HMMs are perhaps the most successful framework in perceptual computing for modeling and ..."

Cited by 497 (22 self)
Graphical models, exponential families, and variational inference
2008
Abstract: "... The formalism of probabilistic graphical models provides a unifying framework for capturing complex dependencies among random variables, and building large-scale multivariate statistical models. Graphical models have become a focus of research in many statistical, computational and mathematical fiel ..."

Cited by 800 (26 self)
"... all be understood in terms of exact or approximate forms of these variational representations. The variational approach provides a complementary alternative to Markov chain Monte Carlo as a general source of approximation methods for inference in large-scale statistical models."
CONDENSATION: conditional density propagation for visual tracking
International Journal of Computer Vision, 1998
Abstract: "... The problem of tracking curves in dense visual clutter is challenging. Kalman filtering is inadequate because it is based on Gaussian densities which, being unimodal, cannot represent simultaneous alternative hypotheses. The Condensation algorithm uses "factored sampling", previously appli ..."

Cited by 1499 (12 self)
"... tracking of agile motion. Notwithstanding the use of stochastic methods, the algorithm runs in near real-time. Contents: 1 Tracking curves in clutter; 2 Discrete-time propagation of state density; 3 Factored sampling; 4 The Condensation algorithm; 5 Stochastic dynamical models for curve motion; 6 ..."
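Factored sampling, the core step the abstract names, can be sketched as one iteration of a bootstrap particle filter: resample the particle set by weight, push each sample through the stochastic dynamical model, then reweight by the observation likelihood. The function below is an illustrative sketch under those assumptions, not the paper's implementation:

```python
import random

def factored_sampling_step(particles, weights, dynamics, likelihood, rng):
    """One step of factored sampling (a bootstrap particle filter step)
    as used by Condensation-style trackers.  `dynamics(x, rng)` propagates
    a state sample; `likelihood(x)` scores it against the observation."""
    n = len(particles)
    # select: resample with replacement, proportional to current weights
    resampled = rng.choices(particles, weights=weights, k=n)
    # predict: propagate each sample through the stochastic dynamics
    predicted = [dynamics(x, rng) for x in resampled]
    # measure: reweight by how well each prediction explains the data
    new_w = [likelihood(x) for x in predicted]
    total = sum(new_w)
    return predicted, [w / total for w in new_w]
```

The weighted particle set approximates the posterior state density, so multimodal hypotheses (which defeat a Kalman filter) are represented simply as clusters of particles.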
Dynamic Bayesian Networks: Representation, Inference and Learning
2002
Abstract: "... Modelling sequential data is important in many areas of science and engineering. Hidden Markov models (HMMs) and Kalman filter models (KFMs) are popular for this because they are simple and flexible. For example, HMMs have been used for speech recognition and biosequence analysis, and KFMs have bee ..."

Cited by 758 (3 self)
"... sequential data. In particular, the main novel technical contributions of this thesis are as follows: a way of representing Hierarchical HMMs as DBNs, which enables inference to be done in O(T) time instead of O(T^3), where T is the length of the sequence; an exact smoothing algorithm that takes O(log T ..."