## Learning Stochastic Regular Grammars by Means of a State Merging Method (1994)

Citations: 136 (12 self)

### BibTeX

```bibtex
@INPROCEEDINGS{Carrasco94learningstochastic,
  author    = {Rafael C. Carrasco and Jose Oncina},
  title     = {Learning Stochastic Regular Grammars by Means of a State Merging Method},
  booktitle = {},
  year      = {1994},
  pages     = {139--152},
  publisher = {Springer-Verlag}
}
```


### Abstract

We propose a new algorithm which allows for the identification of any stochastic deterministic regular language as well as the determination of the probabilities of the strings in the language. The algorithm builds the prefix tree acceptor from the sample set and systematically merges equivalent states. Experimentally, it proves very fast, and the time needed grows only linearly with the size of the sample set.
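The starting point of the algorithm described in the abstract is a prefix tree acceptor built from the sample, with frequency counts attached to transitions and final states; state merging then operates on this tree. A minimal sketch of that first step, assuming a frequency-annotated tree representation (the names `Node` and `build_fpta` are illustrative, not from the paper):

```python
from collections import defaultdict

class Node:
    """One state of the prefix tree: outgoing edges, edge counts, end-of-string count."""
    def __init__(self):
        self.children = {}               # symbol -> Node
        self.counts = defaultdict(int)   # symbol -> number of sample strings using this edge
        self.ends = 0                    # number of sample strings terminating at this state

def build_fpta(sample):
    """Insert every string of the sample into a prefix tree, keeping frequencies."""
    root = Node()
    for string in sample:
        node = root
        for symbol in string:
            node.counts[symbol] += 1
            node = node.children.setdefault(symbol, Node())
        node.ends += 1
    return root

sample = ["ab", "ab", "a", "b"]
fpta = build_fpta(sample)

# Relative frequencies at a state estimate its transition probabilities,
# e.g. at the root: 3 of the 4 strings start with "a".
p_a = fpta.counts["a"] / len(sample)
```

Since every string is a single pass down the tree, construction time grows linearly with the total length of the sample, consistent with the linear behaviour reported in the abstract; the merging of equivalent states (the core of the method) is not shown here.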

### Citations

1886 | An Introduction to Probability Theory and its Applications - Feller - 1968 |

1496 | Probability inequalities for sums of bounded random variables, J. Amer. Statist. - Hoeffding - 1963 |

220 | Complexity of Automaton Identification from Given Data - Gold - 1978 |

204 | Syntactic Pattern Recognition and Applications - Fu - 1982 |

172 | Learning and extracting finite state automata with second-order recurrent neural networks - Giles, Miller, et al. - 1992
Citation Context: ...t TIC93-0633-C02-02 from CICYT (Programa Nacional de Tecnologías de la Información y de las Comunicaciones) In the last years, neural network models were used in order to identify regular languages [5, 6, 7, 8] and they have been applied to the problem of stochastic samples [9]. However, these methods share the serious drawback that long computational times and vast sample sets are needed. Hidden Markov mode... |

82 | Induction of finite-state languages using second-order recurrent networks - Watrous, Kuhn - 1992
Citation Context: ...t TIC93-0633-C02-02 from CICYT (Programa Nacional de Tecnologías de la Información y de las Comunicaciones) In the last years, neural network models were used in order to identify regular languages [5, 6, 7, 8] and they have been applied to the problem of stochastic samples [9]. However, these methods share the serious drawback that long computational times and vast sample sets are needed. Hidden Markov mode... |

41 | Identifying Languages from Stochastic Examples - Angluin - 1988
Citation Context: ...i.e., only strings in the language) is given, but they can be identified if a complete presentation (where all strings are classified as belonging or not to the language) is provided. However, Angluin [2] proved that a wide range of distribution classes, including the SRL, are identifiable from positive samples (text) with probability one. With this aim, some attempts to find suitable learning procedu... |

36 | Identifying regular languages in polynomial time - Oncina, Garcia - 1992
Citation Context: ...find an algorithm which identifies in the limit stochastic regular languages and whose complexity does not grow exponentially with the size of S. Our approach will be based on the one proposed in ref. [11] for the identification of (nonstochastic) regular languages. For this reason, we will briefly describe it in the following. Given a language L, the minimum DFA generating L is called the canonical ac... |

10 | Implicit learning of artificial grammars. Journal of Verbal Learning and Verbal Behavior - Reber - 1967
Citation Context: ...ted with a variety of grammars. For each grammar, different samples were generated by the canonical stochastic automaton of the grammar and given as input for ALERGIA. For instance, the Reber grammar [16] of fig. 5 has been used in order to compare ALERGIA with previous works on neural networks which used this grammar as a check [9]. In fig. 6 we plot the average number of nodes in the automaton found by... |

7 | Inference of Finite-State Probabilistic Grammars - Maryanski, Booth - 1977
Citation Context: ...es, including the SRL, are identifiable from positive samples (text) with probability one. With this aim, some attempts to find suitable learning procedures have already been done. Maryanski and Booth [3] used a chi-square test in order to filter regular grammars provided by heuristic methods. Although convergence to the true one was not guaranteed, acceptable grammars were always found. The approach ... |

3 | Simulation of stochastic regular grammars through simple recurrent networks - Castano, Casacuberta, et al. - 1993
Citation Context: ...ción y de las Comunicaciones) In the last years, neural network models were used in order to identify regular languages [5, 6, 7, 8] and they have been applied to the problem of stochastic samples [9]. However, these methods share the serious drawback that long computational times and vast sample sets are needed. Hidden Markov models are used by Stolcke and Omohundro [10] in order to maximize the p... |

1 | On the Inference of Stochastic Regular Grammars. Information and Control 38 - van der Mude, Walker - 1978
Citation Context: ...n order to filter regular grammars provided by heuristic methods. Although convergence to the true one was not guaranteed, acceptable grammars were always found. The approach of van der Mude and Walker [4] merges variables in a stochastic regular grammar, where Bayesian criteria are applied. In that paper [4], convergence to the true grammar was not proved and the algorithm showed too slow for applicatio... |

1 | Learning Sequential Structure with the Real-Time Recurrent Learning Algorithm - Smith, Zipser - 1989
Citation Context: ...t TIC93-0633-C02-02 from CICYT (Programa Nacional de Tecnologías de la Información y de las Comunicaciones) In the last years, neural network models were used in order to identify regular languages [5, 6, 7, 8] and they have been applied to the problem of stochastic samples [9]. However, these methods share the serious drawback that long computational times and vast sample sets are needed. Hidden Markov mode... |

1 | The Induction of Dynamical Recognizers - Pollack |

1 | Hidden Markov Model Induction by Bayesian Model Merging. To appear in - Stolcke - 1993 |