Results 1-10 of 19
The Hierarchical Hidden Markov Model: Analysis and Applications
 Machine Learning
, 1998
"... . We introduce, analyze and demonstrate a recursive hierarchical generalization of the widely used hidden Markov models, which we name Hierarchical Hidden Markov Models (HHMM). Our model is motivated by the complex multiscale structure which appears in many natural sequences, particularly in langua ..."
Abstract

Cited by 307 (3 self)
We introduce, analyze and demonstrate a recursive hierarchical generalization of the widely used hidden Markov models, which we name Hierarchical Hidden Markov Models (HHMM). Our model is motivated by the complex multiscale structure which appears in many natural sequences, particularly in language, handwriting and speech. We seek a systematic unsupervised approach to the modeling of such structures. By extending the standard forward-backward (Baum-Welch) algorithm, we derive an efficient procedure for estimating the model parameters from unlabeled data. We then use the trained model for automatic hierarchical parsing of observation sequences. We describe two applications of our model and its parameter estimation procedure. In the first application we show how to construct hierarchical models of natural English text. In these models different levels of the hierarchy correspond to structures on different length scales in the text. In the second application we demonstrate how HHMMs can b...
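The HHMM estimation procedure extends the flat-HMM forward-backward recursion. As a point of reference, here is a minimal NumPy sketch of the standard (non-hierarchical) forward pass that Baum-Welch builds on; the model sizes and parameter values are illustrative, not from the paper:

```python
import numpy as np

def forward_loglik(pi, A, B, obs):
    """Scaled forward recursion for a flat HMM: returns log p(obs).

    pi  : (K,) initial state distribution
    A   : (K, K) transition matrix, A[i, j] = p(next state j | state i)
    B   : (K, M) emission matrix, B[k, m] = p(symbol m | state k)
    obs : sequence of symbol indices
    """
    alpha = pi * B[:, obs[0]]          # unnormalized filtering distribution
    loglik = np.log(alpha.sum())
    alpha /= alpha.sum()               # rescale to avoid underflow
    for t in obs[1:]:
        alpha = (alpha @ A) * B[:, t]  # propagate, then weight by emission
        loglik += np.log(alpha.sum())
        alpha /= alpha.sum()
    return loglik

# Toy 2-state, 2-symbol model
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3], [0.2, 0.8]])
B = np.array([[0.9, 0.1], [0.3, 0.7]])
loglik = forward_loglik(pi, A, B, [0, 1, 0])
```

Baum-Welch alternates this pass (plus a backward pass) with re-estimation of `pi`, `A`, and `B`; the HHMM version generalizes the recursion to run over the state hierarchy.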
A new look at state-space models for neural data
 Journal of Computational Neuroscience
, 2010
"... State space methods have proven indispensable in neural data analysis. However, common methods for performing inference in statespace models with nonGaussian observations rely on certain approximations which are not always accurate. Here we review direct optimization methods that avoid these appro ..."
Abstract

Cited by 48 (24 self)
State-space methods have proven indispensable in neural data analysis. However, common methods for performing inference in state-space models with non-Gaussian observations rely on certain approximations which are not always accurate. Here we review direct optimization methods that avoid these approximations, but that nonetheless retain the computational efficiency of the approximate methods. We discuss a variety of examples, applying these direct optimization techniques to problems in spike train smoothing, stimulus decoding, parameter estimation, and inference of synaptic properties. Along the way, we point out connections to some related standard statistical methods, including spline smoothing and isotonic regression. Finally, we note that the computational methods reviewed here do not in fact depend on the state-space setting at all; instead, the key property we are exploiting involves the bandedness of certain matrices. We close by discussing some applications of this more general point of view, including Markov chain Monte Carlo methods for neural decoding and efficient estimation of spatially varying firing rates.
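The bandedness observation can be illustrated on the simplest case: a Gaussian random walk observed in Gaussian noise, where the posterior precision over the whole state path is tridiagonal, so the exact MAP path is a single O(T) banded solve rather than an O(T^3) dense one. The model parameters and the Thomas-algorithm solver below are an illustrative sketch, not the paper's code:

```python
import numpy as np

def thomas_solve(lower, diag, upper, b):
    """Solve a tridiagonal system in O(T) (Thomas algorithm).

    lower[i] = a[i, i-1] (lower[0] unused), upper[i] = a[i, i+1]
    (upper[-1] unused), diag[i] = a[i, i].
    """
    n = len(diag)
    c = np.empty(n)
    d = np.empty(n)
    c[0] = upper[0] / diag[0]
    d[0] = b[0] / diag[0]
    for i in range(1, n):
        denom = diag[i] - lower[i] * c[i - 1]
        c[i] = upper[i] / denom if i < n - 1 else 0.0
        d[i] = (b[i] - lower[i] * d[i - 1]) / denom
    x = np.empty(n)
    x[-1] = d[-1]
    for i in range(n - 2, -1, -1):
        x[i] = d[i] - c[i] * x[i + 1]
    return x

def map_smooth(y, obs_var=0.1, walk_var=0.01):
    """MAP state path for x_t = x_{t-1} + walk noise, y_t = x_t + obs noise.

    The posterior precision J = I/obs_var + D'D/walk_var is tridiagonal
    (banded), so the exact MAP path J x = y/obs_var costs O(T).
    """
    T = len(y)
    diag = np.full(T, 1.0 / obs_var + 2.0 / walk_var)
    diag[[0, -1]] = 1.0 / obs_var + 1.0 / walk_var  # boundary terms
    off = np.full(T, -1.0 / walk_var)
    return thomas_solve(off, diag, off, np.asarray(y) / obs_var)
```

For non-Gaussian (e.g. point-process) observations, each Newton step of the direct optimization has the same banded structure, which is what keeps the method as cheap as the approximate recursions.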
Gaussian-process factor analysis for low-dimensional single-trial analysis of neural population activity
 J Neurophysiol
"... You might find this additional information useful... A corrigendum for this article has been published. It can be found at: ..."
Abstract

Cited by 48 (15 self)
A corrigendum for this article has been published.
Synergy and Redundancy Among Brain Cells of Behaving Monkeys
 Advances in Neural Information Processing Systems
, 1999
"... Determining the relationship between the activity of a single nerve cell to that of an entire population is a fundamental question that bears on the basic neural computation paradigms. In this paper we apply an information theoretic approach to quantify the level of cooperative activity among cells ..."
Abstract

Cited by 12 (3 self)
Determining the relationship between the activity of a single nerve cell and that of an entire population is a fundamental question that bears on basic neural computation paradigms. In this paper we apply an information-theoretic approach to quantify the level of cooperative activity among cells in a behavioral context. It is possible to discriminate between synergistic and redundant activity of the cells, depending on the difference between the information they provide when measured jointly and the information they provide independently. We define a synergy value that is positive in the first case and negative in the second, and show that the synergy value can be measured by detecting the behavioral mode of the animal from the simultaneously recorded activity of the cells. We observe that positive synergy can be found among cortical cells, while cells from the basal ganglia, active during the same task, do not exhibit similar synergistic activity.
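The synergy value described above can be read as I((X1, X2); M) - I(X1; M) - I(X2; M), where M is the behavioral mode: positive when the pair is jointly more informative than the sum of its parts. A small NumPy sketch under that reading (the discrete joint-table representation and variable names are assumptions for illustration, not the paper's code):

```python
import numpy as np

def mutual_info(joint):
    """I(X; Y) in bits from a joint probability table p(x, y)."""
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    nz = joint > 0                         # 0 log 0 := 0
    return float((joint[nz] * np.log2(joint[nz] / (px @ py)[nz])).sum())

def synergy(p_abm):
    """Synergy of two cells about a behavioral mode.

    p_abm[a, b, m] = p(cell-1 response a, cell-2 response b, mode m).
    Returns I((A, B); M) - I(A; M) - I(B; M): positive for synergistic
    coding, negative for redundant coding.
    """
    A, B, M = p_abm.shape
    i_pair = mutual_info(p_abm.reshape(A * B, M))  # treat the pair jointly
    i_1 = mutual_info(p_abm.sum(axis=1))           # p(a, m)
    i_2 = mutual_info(p_abm.sum(axis=0))           # p(b, m)
    return i_pair - i_1 - i_2
```

A sanity check on the sign convention: if the mode is the XOR of two independent binary responses, neither cell alone carries information but the pair carries 1 bit, giving synergy +1; if both cells simply copy the mode, the synergy is -1 (fully redundant).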
Algorithms for Understanding Motor Cortical Processing and Neural Prosthetic Systems
, 2009
"... ii ..."
(Show Context)
Spatio-Temporal Clustering of Firing Rates for Neural State Estimation
"... Abstract — Characterizing the dynamics of neural data by a discrete state variable is desirable in experimental analysis and brainmachine interfaces. Previous successes have used dynamical modeling including Hidden Markov Models, but the methods do not always produce meaningful results without bein ..."
Abstract
Characterizing the dynamics of neural data by a discrete state variable is desirable in experimental analysis and brain-machine interfaces. Previous successes have used dynamical modeling, including hidden Markov models, but these methods do not always produce meaningful results unless carefully trained or initialized. We propose unsupervised clustering in the spatio-temporal space of neural data using time embedding and a corresponding distance measure. By defining performance measures, the method's parameters are investigated for a set of neural and simulated data, with promising results. Our investigations demonstrate a different view of how to extract information to maximize the utility of state estimation.
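A minimal sketch of the time-embedding idea, with a plain k-means stand-in for the clustering step; the lag count and the Euclidean distance here are illustrative choices, and the paper defines its own distance measure:

```python
import numpy as np

def time_embed(rates, lags):
    """Stack `lags` consecutive time bins into one vector per time step.

    rates : (T, N) binned firing rates for N units
    Returns (T - lags + 1, lags * N): each row is a short spatio-temporal
    window, so clustering acts on trajectories rather than single bins.
    """
    T, N = rates.shape
    return np.hstack([rates[i : T - lags + 1 + i] for i in range(lags)])

def kmeans_states(X, k, iters=50, seed=0):
    """Minimal Lloyd's k-means, standing in for the clustering step."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):          # skip empty clusters
                centers[j] = X[labels == j].mean(axis=0)
    return labels
```

The cluster label assigned to each embedded window then serves as the discrete state estimate for the corresponding time bin.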
Statistical analysis of neural data: Discrete-space hidden Markov models
, 2009
"... 1.1 Example: the switching Poisson model is a simple model for spike trains which flip between a few distinct firing states...................... 5 1.2 Example: ion channels are often modeled as HMMs............... 7 ..."
Abstract
1.1 Example: the switching Poisson model is a simple model for spike trains which flip between a few distinct firing states.
1.2 Example: ion channels are often modeled as HMMs.
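As a concrete illustration of example 1.1, a switching Poisson spike train can be simulated by letting a hidden Markov chain select the current firing rate; the rates, transition probabilities, and bin width below are illustrative choices, not values from the notes:

```python
import numpy as np

def switching_poisson(rates, A, T, dt=0.01, seed=0):
    """Simulate a switching Poisson spike train.

    A hidden Markov chain with per-bin transition matrix A selects among
    a few firing rates (Hz); the spike count in each bin of width dt is
    Poisson with mean rate * dt for the current state's rate.
    """
    rng = np.random.default_rng(seed)
    K = len(rates)
    states = np.empty(T, dtype=int)
    states[0] = rng.integers(K)
    for t in range(1, T):
        states[t] = rng.choice(K, p=A[states[t - 1]])
    spikes = rng.poisson(np.asarray(rates)[states] * dt)
    return states, spikes

# Two firing states: a quiet state (5 Hz) and an active state (50 Hz)
A = np.array([[0.95, 0.05],
              [0.05, 0.95]])
states, spikes = switching_poisson([5.0, 50.0], A, T=20000)
```

Fitting this model to observed spike counts is exactly the discrete-space HMM inference problem the notes develop: the forward-backward recursion recovers the hidden firing state, and Baum-Welch (or EM) recovers the rates and transition probabilities.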
Journal of Neuroscience Methods
"... This article appeared in a journal published by Elsevier. The attached copy is furnished to the author for internal noncommercial research and education use, including for instruction at the authors institution and sharing with colleagues. Other uses, including reproduction and distribution, or sel ..."
Abstract
 Add to MetaCart
(Show Context)
This article appeared in a journal published by Elsevier. The attached copy is furnished to the author for internal noncommercial research and education use, including for instruction at the authors institution and sharing with colleagues. Other uses, including reproduction and distribution, or selling or licensing copies, or posting to personal, institutional or third party websites are prohibited. In most cases authors are permitted to post their version of the article (e.g. in Word or Tex form) to their personal website or institutional repository. Authors requiring further information regarding Elsevier’s archiving and manuscript policies are encouraged to visit: