Results 1 – 8 of 8
Beyond the Turing Test
 J. Logic, Language & Information
Abstract

Cited by 34 (18 self)
We define the main factor of intelligence as the ability to comprehend, formalising this ability with the help of new constructs based on descriptional complexity. The result is a comprehension test, or C-test, defined exclusively in terms of universal descriptional machines (e.g. universal Turing machines). Despite the absolute and non-anthropomorphic character of the test, it is equally applicable to both humans and machines. Moreover, it correlates with classical psychometric tests, thus establishing the first firm connection between information-theoretic notions and traditional IQ tests. The Turing Test is compared with the C-test and their joint combination is discussed. As a result, the idea of the Turing Test as a practical test of intelligence should be left behind, and replaced by computational and factorial tests of different cognitive abilities, a much more useful approach both for progress in artificial intelligence and for many other intriguing questions that lie beyond the Turing Test.
Connectionist and statistical approaches to language acquisition: A distributional perspective
 Language and Cognitive Processes
, 1998
Abstract

Cited by 27 (4 self)
We propose that one important role for connectionist research in language acquisition is analysing what linguistic information is present in the child's input. Recent connectionist and statistical work analysing the properties of real language corpora suggests that a priori objections against the utility of distributional information for the child are misguided. We illustrate our argument with examples of connectionist and statistical corpus-based research on phonology, segmentation, morphology, word classes, phrase structure, and lexical semantics. We discuss how this research relates to other empirical and theoretical approaches to the study of language acquisition.
Probabilistic DFA Inference using Kullback-Leibler Divergence and Minimality
 In Seventeenth International Conference on Machine Learning
, 2000
Abstract

Cited by 11 (2 self)
Probabilistic DFA inference is the problem of inducing a stochastic regular grammar from a positive sample of an unknown language. The ALERGIA algorithm is one of the most successful approaches to this problem. In the present work we review this algorithm and explain why its generalization criterion, a state merging operation, is purely local. This characteristic leads to the conclusion that there is no explicit way to bound the divergence between the distribution defined by the solution and the training set distribution (that is, to control globally the generalization from the training sample). In this paper we present an alternative approach, the MDI algorithm, in which the solution is a probabilistic automaton that trades off minimal divergence from the training sample and minimal size. An efficient computation of the Kullback-Leibler divergence between two probabilistic DFAs is described, from which the new learning criterion is derived. Empirical results in the d...
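The MDI criterion sketched in the abstract above rests on the Kullback-Leibler divergence. As a rough illustration only (not the paper's per-state algorithm for probabilistic DFAs), the following sketch computes the divergence D(p || q) for two finite distributions represented as plain Python dicts; the dict representation and the toy distributions are assumptions for illustration:

```python
import math

def kl_divergence(p, q):
    """D(p || q) for two discrete distributions given as dicts
    mapping outcomes to probabilities. Illustrative sketch: the
    paper computes this quantity between the string distributions
    of two probabilistic DFAs, which requires a per-state
    computation rather than a finite sum over outcomes."""
    total = 0.0
    for x, px in p.items():
        if px > 0:
            qx = q.get(x, 0.0)
            if qx == 0.0:
                # q assigns zero mass to an outcome p can produce
                return math.inf
            total += px * math.log(px / qx)
    return total

# toy example over a two-symbol alphabet
p = {"a": 0.5, "b": 0.5}
q = {"a": 0.9, "b": 0.1}
print(kl_divergence(p, q))
```

The divergence is zero exactly when the two distributions agree, and grows as q underweights outcomes that p favours, which is why it serves as a measure of how far a merged automaton has drifted from the training sample.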
Reviewed by
Abstract

Cited by 3 (0 self)
This collection of invited papers covers a lot of ground in its nearly 800 pages, so any review of reasonable length will necessarily be selective. However, there are a number of features that make the book as a whole a comparatively easy and thoroughly rewarding read. Multi-author compendia of this kind are often disjointed, with very little uniformity from chapter to chapter in terms of breadth, depth, and format. Such is not the case here. Breadth and depth of treatment are surprisingly consistent, with coherent formats that often include both a little history of the field and some thoughts about the future. The volume has a very logical structure in which the chapters flow and follow on from each other in an orderly fashion. There are also many cross-references between chapters, which allow the authors to build upon the foundation of one another's work and eliminate redundancies. Specifically, the contents consist of 38 survey papers grouped into three parts: Fundamentals; Processes, Methods, and Resources; and Applications. Taken together, they provide both a comprehensive introduction to the field and a useful reference volume. In addition to the usual author and subject matter indices, there is a substantial
Logically Reliable Inductive Inference
, 2005
Abstract
This paper aims to be a friendly introduction to formal learning theory. I introduce key concepts at a slow pace, comparing and contrasting them with other approaches to inductive inference such as confirmation theory. A number of examples are discussed, some in detail, such as Goodman's Riddle of Induction. I outline some important results of formal learning theory that are of philosophical interest. Finally, I discuss recent developments in this approach to inductive inference.
unknown title
Abstract
Inferring finite transducers
We consider the inference problem for finite transducers using different kinds of samples (positive and negative samples, positive samples only, and structural samples). Given pairs of input and output words, our task is to infer the finite transducer consistent with the given pairs. We show that this problem can be solved in certain special cases by using known results on the inference problem for linear languages.
The Significance of Errors to Parametric Models of Language Acquisition
Abstract
The aim of this research is to investigate the process of grammatical acquisition from real data. In this paper we address the issue of errors. We demonstrate by simulation how a learning system may be robust when statistical error handling methods are employed.
Classification of Input Data
A normal child becomes rapidly fluent in their native language despite an absence of any formal language teaching. The child is exposed to evidence of her target language that must exclusively belong to one of three possible classes: positive evidence is information that describes which utterances are allowed in the target language; negative evidence is information that describes which utterances are not allowed in the target language; errors are pieces of informa