Results 1–10 of 17
Inferring Conservation Laws in Particle Physics: A Case Study in the Problem of Induction
The British Journal for the Philosophy of Science, Forthcoming, 2001
"... This paper develops a meansends analysis of an inductive problem that arises in particle physics: how to infer from observed reactions conservation principles that govern all reactions among elementary particles. I show that there is a reliable inference procedure that is guaranteed to arrive at an ..."
Cited by 11 (1 self)
Abstract:
This paper develops a means-ends analysis of an inductive problem that arises in particle physics: how to infer, from observed reactions, conservation principles that govern all reactions among elementary particles. I show that there is a reliable inference procedure that is guaranteed to arrive at an empirically adequate set of conservation principles as more and more evidence is obtained. An interesting feature of reliable procedures for finding conservation principles is that in certain precisely defined circumstances they must introduce hidden particles. Among the reliable inductive methods there is a unique procedure that minimizes convergence time as well as the number of times that the method revises its conservation principles. Thus the aims of reliable, fast, and steady convergence to an empirically adequate theory single out a unique optimal inference for a given set of observed reactions, including prescriptions for when exactly to introduce hidden particles.
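The empirical-adequacy test the abstract describes has a simple linear-algebra reading: each observed reaction is a vector of net particle-number changes, and a candidate conserved quantity is an assignment of quantum numbers that is orthogonal to every such vector. A minimal sketch of that test (a toy illustration, not the paper's actual inference procedure; the particle names and quantum-number assignments are our own examples):

```python
# Toy check of candidate conservation laws against observed reactions.
# Each reaction is encoded as the vector (products minus reactants)
# of particle counts; a conserved quantity must be unchanged by it.

def reaction_vector(reactants, products, species):
    """Net change in the count of each species for one reaction."""
    v = [0] * len(species)
    for p in reactants:
        v[species.index(p)] -= 1
    for p in products:
        v[species.index(p)] += 1
    return v

def conserves(q, reactions):
    """A quantum-number assignment q is empirically adequate if its
    dot product with every observed reaction vector is zero."""
    return all(sum(qi * ri for qi, ri in zip(q, r)) == 0
               for r in reactions)

species = ["p", "n", "e-", "nu_e_bar"]
# Observed: beta decay, n -> p + e- + anti-neutrino
reactions = [reaction_vector(["n"], ["p", "e-", "nu_e_bar"], species)]

baryon = [1, 1, 0, 0]    # baryon-number assignment
charge = [1, 0, -1, 0]   # electric-charge assignment
print(conserves(baryon, reactions))  # True
print(conserves(charge, reactions))  # True
```

On this reading, the paper's "hidden particles" correspond to adding new columns to the reaction vectors so that a consistent assignment exists.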
Recursion Theoretic Models of Learning: Some Results and Intuitions
Annals of Mathematics and Artificial Intelligence, 1995
"... View of Learning To implement a program that somehow "learns" it is neccessary to fix a set of concepts to be learned and develop a representation for the concepts and examples of the concepts. In order to investigate general properties of machine learning it is neccesary to work in as re ..."
Cited by 5 (2 self)
Abstract:
View of Learning. To implement a program that somehow "learns", it is necessary to fix a set of concepts to be learned and develop a representation for the concepts and examples of the concepts. In order to investigate general properties of machine learning, it is necessary to work in as representation-independent a fashion as possible. In this work, we consider machines that learn programs for recursive functions. Several authors have argued that such studies are general enough to include a wide array of learning situations [2,3,22,23,24]. For example, a behavior to be learned can be modeled as a set of stimulus-response pairs. Assuming that any behavior associates only one response to each possible stimulus, behaviors can be viewed as functions from stimuli to responses. Some behaviors, such as anger, are not easily modeled as functions. Our primary interest, however, concerns the learning of fundamental behaviors such as reading (mapping symbols to sounds), recognition (mapping pa...
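The function-learning setting described here can be sketched with a toy identification-by-enumeration learner: given an ordered list of candidate hypotheses (finite here, for illustration), the learner conjectures the first hypothesis consistent with all stimulus-response pairs seen so far. This is a simplification of the recursion-theoretic model, not a reconstruction of this paper's results:

```python
# Toy identification-by-enumeration over a finite hypothesis class,
# standing in for learning programs for recursive functions: after
# each example the learner emits the index of the first hypothesis
# consistent with everything seen so far.

def learn_in_limit(hypotheses, data_stream):
    """Yield the learner's conjecture (hypothesis index) after each
    (stimulus, response) pair; None if nothing remains consistent."""
    seen = []
    for x, y in data_stream:
        seen.append((x, y))
        for i, h in enumerate(hypotheses):
            if all(h(a) == b for a, b in seen):
                yield i
                break
        else:
            yield None

hyps = [lambda x: 0, lambda x: x, lambda x: x * x]
stream = [(0, 0), (1, 1), (2, 4)]  # graph of x -> x^2
print(list(learn_in_limit(hyps, stream)))  # [0, 1, 2]
```

In the limit, the conjecture stabilizes on the first correct hypothesis (index 2 above); the full recursion-theoretic model replaces the finite list with an enumeration of programs.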
Machine induction without revolutionary changes in hypothesis size
Information and Computation, 1996
"... This paper provides a beginning study of the effects on inductive inference of paradigm shifts whose absence is approximately modeled by various formal approaches to forbidding large changes in the size of programs conjectured. One approach, called severely parsimonious, requires all the programs co ..."
Cited by 3 (2 self)
Abstract:
This paper provides a beginning study of the effects on inductive inference of paradigm shifts whose absence is approximately modeled by various formal approaches to forbidding large changes in the size of the programs conjectured. One approach, called severely parsimonious, requires all the programs conjectured on the way to success to be nearly (i.e., within a recursive function of) minimal size. It is shown that this very conservative constraint allows learning infinite classes of functions, but not infinite r.e. classes of functions. Another approach, called nonrevolutionary, requires all conjectures to be nearly the same size as one another. This quite conservative constraint is, nonetheless, shown to permit learning some infinite r.e. classes of functions. Allowing up to one extra bounded-size mind change towards a final program learned certainly doesn't appear revolutionary. However, somewhat surprisingly for scientific (inductive) inference, it is shown that there are classes learnable with the nonrevolutionary constraint (respectively, with severe parsimony), up to (i + 1) mind changes, and no anomalies, which cannot be learned with no size constraint and an unbounded, finite number of anomalies in the final program, but with no more than i mind changes. Hence, in some cases, the possibility of one extra mind change is considerably more liberating than the removal of very conservative size-shift constraints. The proofs of these results are also combinatorially interesting.
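The two bookkeeping notions in play, counting mind changes and bounding the spread of conjectured program sizes, can be made concrete with a small sketch (our own toy formalization, with an illustrative size bound `b`, not the paper's definitions):

```python
# Toy bookkeeping for two notions from the abstract: how many times a
# learner revises its conjecture, and whether a conjecture sequence is
# "nonrevolutionary" in the sense that all conjectured programs have
# nearly the same size (here: sizes within a fixed bound b).

def mind_changes(conjectures):
    """Number of times consecutive conjectures differ."""
    return sum(1 for prev, cur in zip(conjectures, conjectures[1:])
               if cur != prev)

def nonrevolutionary(program_sizes, b):
    """True iff no two conjectured programs differ in size by more than b."""
    return max(program_sizes) - min(program_sizes) <= b

print(mind_changes([0, 1, 1, 2]))         # 2 revisions
print(nonrevolutionary([10, 12, 11], 3))  # True
print(nonrevolutionary([10, 30, 11], 3))  # False
```

The paper's trade-off can then be read as: tightening `b` (or demanding near-minimal sizes) can cost more than granting the learner one additional mind change.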
Parsimony Hierarchies for Inductive Inference
Journal of Symbolic Logic, 2004
"... Freivalds defined an acceptable programming system independent criterion for learning programs for functions in which the final programs were required to be both correct and "nearly" minimal size, i.e, within a computable function of being purely minimal size. Kinber showed that this parsi ..."
Cited by 2 (1 self)
Abstract:
Freivalds defined an acceptable-programming-system-independent criterion for learning programs for functions in which the final programs were required to be both correct and "nearly" minimal size, i.e., within a computable function of being purely minimal size. Kinber showed that this parsimony requirement on final programs limits learning power. However, in scientific inference, parsimony is considered highly desirable. A lim-computable function is (by definition) one calculable by a total procedure allowed to change its mind finitely many times about its output. Investigated is the possibility of assuaging somewhat the limitation on learning power resulting from requiring parsimonious final programs by use of criteria which require the final, correct programs to be "not-so-nearly" minimal size, e.g., to be within a lim-computable function of actual minimal size. It is shown that some parsimony in the final program is thereby retained, yet learning power strictly increases. Considered, then, are lim-computable functions as above, but for which notations for constructive ordinals are used to bound the number of mind changes allowed regarding the output. This is a variant of an idea introduced by Freivalds and Smith. For this ordinal-notation-complexity-bounded version of lim-computability, the resultant learning criteria form finely graded, infinitely ramifying, infinite hierarchies intermediate between the computable and the lim-computable cases. Some of these hierarchies, for the natural notations determining them, are shown to be optimally tight.
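The definition of a lim-computable function quoted above can be illustrated with a toy guesser: a total two-argument function g(x, t) whose value stabilizes in t after finitely many mind changes, the limit value being the function computed. The particular guesser below (stabilizing to the parity of x) is our own example, not one from the paper:

```python
# A toy lim-computable function: the limit in t of a total computable
# guesser g(x, t) that changes its mind only finitely often. Here the
# guesser outputs 0 until t reaches x, then the parity of x, so each
# input incurs at most one mind change.

def g(x, t):
    """Total computable guesser; for each x its value stabilizes in t."""
    return 0 if t < x else x % 2

def mind_change_count(x, horizon):
    """How often the guesser revises its output up to a finite horizon."""
    return sum(1 for t in range(horizon)
               if g(x, t) != g(x, t + 1))

print(g(5, 100))                  # 1, the stabilized value (5 is odd)
print(mind_change_count(5, 100))  # 1 revision, at t = 4 -> 5
```

The ordinal-notation bounds discussed in the abstract refine exactly this picture, by constraining how many such revisions the guesser may make.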
Measure, Category and Learning Theory
"... Measure and category (or rather, their recursion theoretical counterparts) have been used in Theoretical Computer Science to make precise the intuitive notion "for most of the recursive sets." We use the Supported in part by NSF Grant CCR 9253582. y Supported in part by Latvian Counc ..."
Cited by 2 (1 self)
Abstract:
Measure and category (or rather, their recursion-theoretic counterparts) have been used in theoretical computer science to make precise the intuitive notion "for most of the recursive sets." We use the notions of effective measure and category to discuss the relative sizes of inferable sets, and their complements. We find that inferable sets become large rather quickly in the standard hierarchies of learnability. On the other hand, the complements of the learnable sets are all large.
1 Introduction
Determining the relative size of denumerable sets, and those with cardinality ℵ₁, led mathematicians to develop the notions of measure and category [Oxt71]. Described in t...
Hard Choices in Scientific Inquiry
1997
"... Contents 1 Induction: The Problem and How To Solve It 7 1.1 The Problem of Induction . . . . . . . . . . . . . . . . . . . . . . 7 1.2 Hypothetical Imperatives for Inductive Inference . . . . . . . . . 9 1.3 Results . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11 1.3.1 MeansEn ..."
Cited by 1 (0 self)
Abstract:
Contents
1 Induction: The Problem and How To Solve It
  1.1 The Problem of Induction
  1.2 Hypothetical Imperatives for Inductive Inference
  1.3 Results
    1.3.1 Means-Ends Vindications of Traditional Proposals
    1.3.2 Novel Solutions to Traditional Problems
    1.3.3 New Questions and Answers
    1.3.4 Analysis of Inductive Problems from Scientific Practice
    1.3.5 Rational Choice in Games
  1.4 Overview
2 A Model of Scientific Inquiry
  2.1 Outline
  2.2 A Model of Scientific Inquiry
  2.3 Examples of Inductive Problems and Scientific Methods
On the Relative Sizes of Learnable Sets
"... Measure and category (or rather, their recursiontheoretical counterparts) have been used in theoretical computer science to make precise the intuitive notion "for most of the recursive sets." We use the notions of effective measure and category to discuss the relative sizes of inferrible ..."
Cited by 1 (1 self)
Abstract:
Measure and category (or rather, their recursion-theoretic counterparts) have been used in theoretical computer science to make precise the intuitive notion "for most of the recursive sets." We use the notions of effective measure and category to discuss the relative sizes of inferable sets, and their complements. We find that inferable sets become large rather quickly in the standard hierarchies of learnability. On the other hand, the complements of the learnable sets are all large.
1 Introduction
Determining the relative size of denumerable sets, and those with cardinality ℵ₁, led mathematicians to develop the notions of measure and category [Oxt71]. We investigate an application of measure and category techniques to a branch of learning theory called inductive inference [AS83]. The models of learning used in this field have been inspired by features of human learning. The goal of this work is to determine the relative sizes of classes of inferable sets of functions. The idea ...
Learning Recursive Functions: A Survey
2008
"... Studying the learnability of classes of recursive functions has attracted considerable interest for at least four decades. Starting with Gold’s (1967) model of learning in the limit, many variations, modifications and extensions have been proposed. These models differ in some of the following: the m ..."
Abstract:
Studying the learnability of classes of recursive functions has attracted considerable interest for at least four decades. Starting with Gold’s (1967) model of learning in the limit, many variations, modifications and extensions have been proposed. These models differ in some of the following: the mode of convergence, the requirements intermediate hypotheses have to fulfill, the set of allowed learning strategies, the source of information available to the learner during the learning process, the set of admissible hypothesis spaces, and the learning goals. A considerable amount of work done in this field has been devoted to the characterization of function classes that can be learned in a given model, the influence of natural, intuitive postulates on the resulting learning power, the incorporation of randomness into the learning process, the complexity of learning, among others. On the occasion of Rolf Wiehagen’s 60th birthday, the last four decades of research in that area are surveyed, with a special focus on Rolf Wiehagen’s work, which has made him one of the most influential scientists in the theory of learning recursive functions.