Results 11 – 14 of 14
DCG Induction using MDL and Parsed Corpora
Learning Language in Logic, pages 63–71, Bled, Slovenia, 1999
Abstract

Cited by 3 (2 self)
We show how partial models of natural language syntax (manually written DCGs, with parameters estimated from a parsed corpus) can be automatically extended when trained upon raw text (using MDL). We also show how we can use a parsed corpus as an alternative constraint upon learning. Empirical evaluation suggests that a parsed corpus is more informative than an MDL-based prior. However, best results are achieved when the learner is supervised with a compression-based prior and a parsed corpus.
Algorithmic Information Theory and Machine Learning
, 2000
Abstract
In this paper we only consider the context of concept learning: let X be a set called the instance space. A concept is a subset of X. Concepts are usually identified with their indicator function (by abuse of notation, c(x) = 1 iff x ∈ c). A concept class is a set C ⊆ 2^X.
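The definitions in this abstract can be made concrete with a toy sketch (the instance space and the particular concepts below are invented purely for illustration):

```python
# Instance space X: a small finite set (toy example).
X = set(range(10))

# A concept is a subset of X; here, the even instances.
c = {x for x in X if x % 2 == 0}

def indicator(x, concept):
    """Indicator function of a concept: 1 if x belongs to it, else 0."""
    return 1 if x in concept else 0

# A concept class is a set of concepts, i.e. a subset of the power set 2^X.
concept_class = [c, X - c, set(), X]

print(indicator(4, c))  # 4 is even, so prints 1
print(indicator(3, c))  # 3 is odd, so prints 0
```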
Recent Progress in the Fields of Universal Learning Algorithms and Optimal Search
Abstract
We briefly review recent results in the field of theoretically optimal algorithms for prediction, search, decision making, and reinforcement learning in environments of a very general type. The results may be relevant not only for computer science but also for physics.
Using Literal and Grammatical Statistics
Problems of Information Transmission, 2000
Abstract
Markov chains are used as a model for the sequence of elements of a natural language text. This model is applied to the authorship attribution of texts. An element of a text can be a letter or the grammatical class of a word. It turns out that the frequencies of usage of letter pairs and of pairs of grammatical classes are stable characteristics of an author, and they can be used in disputed-authorship attribution. A comparison of results obtained with letters and with grammatical classes is given.
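The pair-frequency idea behind this abstract can be sketched roughly as follows. This is only an illustration, not the paper's method: the function names, the placeholder texts, and the use of an L1 distance between frequency profiles are all our own assumptions.

```python
from collections import Counter

def bigram_freqs(text):
    """Relative frequencies of adjacent letter pairs
    (a first-order Markov view of the letter sequence)."""
    letters = [ch for ch in text.lower() if ch.isalpha()]
    pairs = list(zip(letters, letters[1:]))
    counts = Counter(pairs)
    total = len(pairs)
    return {p: n / total for p, n in counts.items()}

def profile_distance(f1, f2):
    """L1 distance between two bigram-frequency profiles (our choice of metric)."""
    keys = set(f1) | set(f2)
    return sum(abs(f1.get(k, 0.0) - f2.get(k, 0.0)) for k in keys)

# Attribute a disputed text to the candidate author whose profile is closest.
# The texts here are placeholders; real use needs long samples per author.
known = {
    "A": bigram_freqs("a long sample of text written by author A"),
    "B": bigram_freqs("another long sample written by author B"),
}
disputed = bigram_freqs("a text of unknown origin")
best = min(known, key=lambda a: profile_distance(known[a], disputed))
```

The same scheme applies with grammatical classes instead of letters: replace the letter sequence with the sequence of part-of-speech tags and count tag pairs.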