Results 1 - 4 of 4
Hidden Markov Model Approach to Skill Learning and Its Application to Telerobotics
, 1993
Abstract

Cited by 59 (4 self)
In this paper, we discuss the problem of how human skill can be represented as a parametric model using a hidden Markov model (HMM), and how an HMM-based skill model can be used to learn human skill. An HMM can characterize two stochastic processes, measurable action and immeasurable mental states, which are involved in skill learning. We formulated the learning problem as a multidimensional HMM and developed a programming system which serves as a skill learning testbed for a variety of applications. Based on "the most likely performance" criterion, we can select the best action sequence from all previously measured action data by modeling the skill as an HMM. This selection process can be updated in real time by feeding in new action data and modifying the HMM parameters. We address the implementation of the proposed method in a teleoperation-controlled space robot. An operator specifies the control command by a hand controller for the task of exchanging an Orbit Replaceable Unit, and the robot learns the operation skill by selecting the sequence which represents the most likely performance of the operator. The skill is learned in Cartesian space, joint space, and the velocity domain. The experimental results demonstrate the feasibility of the proposed method in learning human skill and teleoperation control. The learning is significant in eliminating sluggish motion and correcting motion commands which the operator mistakenly generates.
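The "most likely performance" selection described in this abstract can be sketched with the standard scaled forward algorithm: score each recorded action sequence under the learned HMM and keep the argmax. A minimal sketch in Python/NumPy, assuming a hypothetical discrete-observation HMM (the paper's model is multidimensional and continuous-valued, so this is illustrative only):

```python
import numpy as np

def log_likelihood(obs, pi, A, B):
    """Scaled forward algorithm: log P(obs | HMM).
    pi: initial state probs (N,), A: transitions (N, N),
    B: emissions (N, M), obs: sequence of observation symbol indices."""
    alpha = pi * B[:, obs[0]]
    c = alpha.sum()
    log_p = np.log(c)
    alpha = alpha / c
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        c = alpha.sum()
        log_p += np.log(c)
        alpha = alpha / c
    return log_p

def most_likely_performance(demos, pi, A, B):
    """Select the recorded action sequence the HMM scores highest."""
    return max(demos, key=lambda seq: log_likelihood(seq, pi, A, B))
```

In the paper the HMM parameters themselves are re-estimated as new action data arrives; here the model is assumed fixed for brevity.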
Monte Carlo Hidden Markov Models: Learning Non-Parametric Models of Partially Observable Stochastic Processes
 In Proc. of the International Conference on Machine Learning (ICML
, 1999
Abstract

Cited by 22 (6 self)
We present a learning algorithm for nonparametric hidden Markov models with continuous state and observation spaces. All necessary probability densities are approximated using samples, along with density trees generated from such samples. A Monte Carlo version of Baum-Welch (EM) is employed to learn models from data. Regularization during learning is achieved using an exponential shrinking technique. The shrinkage factor, which determines the effective capacity of the learning algorithm, is annealed down over multiple iterations of Baum-Welch, and early stopping is applied to select the right model. Once trained, Monte Carlo HMMs can be run in an anytime fashion. We prove that under mild assumptions, Monte Carlo Hidden Markov Models converge to a local maximum in likelihood space, just like conventional HMMs. In addition, we provide empirical results obtained in a gesture recognition domain.
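The sample-based density representation and the exponential shrinkage step described above can be illustrated in a few lines. The sketch below substitutes a Gaussian kernel density estimate for the paper's density trees, and shows shrinkage pulling samples toward their mean to reduce effective capacity; all names and constants are illustrative, not from the paper:

```python
import numpy as np

def kde_pdf(x, samples, bandwidth=0.2):
    """Density estimate built from samples; a Gaussian KDE stands in
    for the density trees used in the paper."""
    z = (x - samples) / bandwidth
    return np.exp(-0.5 * z**2).sum() / (len(samples) * bandwidth * np.sqrt(2.0 * np.pi))

def shrink(samples, factor):
    """Exponential shrinkage: pull samples toward their mean.
    factor in (0, 1]; smaller values mean lower effective capacity."""
    m = samples.mean()
    return m + factor * (samples - m)
```

Annealing `factor` downward across Baum-Welch iterations corresponds to the capacity-reduction schedule the abstract describes, with early stopping choosing the iteration to keep.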
A Monadic Probabilistic Language
 In Proceedings of the 2003 ACM SIGPLAN international workshop on Types in languages design and implementation
, 2003
Abstract

Cited by 10 (5 self)
Motivated by many practical applications that have to compute in the presence of uncertainty, we propose a monadic probabilistic language based upon the mathematical notion of sampling function. Our language provides a unified representation scheme for probability distributions, enjoys rich expressiveness, and offers high versatility in encoding probability distributions. We also develop a novel style of operational semantics called a horizontal operational semantics, under which an evaluation returns not a single outcome but multiple outcomes. We have preliminary evidence that the horizontal operational semantics improves the ordinary operational semantics with respect to both execution time and accuracy in representing probability distributions.
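The core idea of representing a distribution as a sampling function carries over to any language with first-class functions. A minimal sketch in Python (the paper's language is an ML-style monadic language; `unit` and `bind` here are the standard monad operations, and all names are ours, not the paper's):

```python
import random

# A distribution is represented as a sampling function: it takes a
# random source and returns one outcome drawn from the distribution.

def unit(x):
    """Point-mass distribution: always returns x."""
    return lambda rng: x

def bind(dist, f):
    """Monadic bind: draw from dist, then draw from the distribution
    that f builds from that outcome."""
    return lambda rng: f(dist(rng))(rng)

uniform = lambda rng: rng.random()  # Uniform(0, 1)

# Sum of two independent uniforms, built compositionally with bind.
two_uniform_sum = bind(uniform, lambda a:
                  bind(uniform, lambda b: unit(a + b)))
```

Running the sampling function repeatedly approximates the encoded distribution, which is the sense in which an evaluation can be made to return multiple outcomes rather than one.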
Monte Carlo Hidden Markov Models
, 1998
Abstract

Cited by 8 (1 self)
We present a learning algorithm for hidden Markov models with continuous state and observation spaces. All necessary probability density functions are approximated using samples, along with density trees generated from such samples. A Monte Carlo version of Baum-Welch (EM) is employed to learn models from data, just as in regular HMM learning. Regularization during learning is obtained using an exponential shrinking technique. The shrinkage factor, which determines the effective capacity of the learning algorithm, is annealed down over multiple iterations of Baum-Welch, and early stopping is applied to select the right model. We prove that under mild assumptions, Monte Carlo Hidden Markov Models converge to a local maximum in likelihood space, just like conventional HMMs. In addition, we provide empirical results obtained in a gesture recognition domain, which illustrate the appropriateness of the approach in practice.