## On Duality in Learning and the Selection of Learning Teams (1996)

Venue: Information and Computation

Citations: 3 (2 self)

### BibTeX

```bibtex
@ARTICLE{Apsitis96onduality,
  author  = {Kalvis Apsitis and Rusins Freivalds and Carl H. Smith},
  title   = {On Duality in Learning and the Selection of Learning Teams},
  journal = {Information and Computation},
  year    = {1996},
  volume  = {129}
}
```


### Abstract

Previous work in inductive inference dealt mostly with finding one or several machines (IIMs) that successfully learn a collection of functions. Herein we start with a class of functions and consider the learner set of all IIMs that are successful at learning the given class. Applying this perspective to the case of team inference leads to the notion of diversification for a class of functions. This enables us to distinguish between several flavors of IIMs, all of which must be represented in a team learning the given class.

From Section 1 (Introduction): All current theoretical approaches to machine learning tend to focus on a particular machine or a collection of machines and then find the class of concepts which can be learned by these machines under certain constraints defining a criterion of successful learning [AS83, OSW86]. In this paper we investigate the dual problem: Given some set of concepts, which algorithms can learn all those concepts? From [AGS89] we know that in the theory ...

### Citations

893 | Language identification in the limit - Gold - 1967

Citation Context: "...tively as follows: 'Machine M identifies (or learns) all functions f ∈ L(M) in the sense of type L'. In this paper L will denote any one of the types EX, EX_n, FIN to be defined below. Definition 1 ([Gol67]): Machine M learns recursive function f in the limit, written f ∈ EX(M), if for any sequence of input strings (σ_n) ↗ f we have the sequence of conjectures M(σ_n) converging to some correct index h o..."
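Definition 1 can be made concrete with a toy learning-by-enumeration sketch: the learner always conjectures the index of the first hypothesis in a fixed list that is consistent with the data seen so far. The hypothesis list and function names below are illustrative assumptions, not from the paper.

```python
# Toy illustration of Definition 1 (identification in the limit):
# an enumeration learner that conjectures the index of the first
# hypothesis consistent with the initial segment seen so far.
# The hypothesis list is an illustrative assumption, not from the paper.

HYPOTHESES = [
    ("zero", lambda x: 0),
    ("identity", lambda x: x),
    ("square", lambda x: x * x),
]

def learner(segment):
    """Index of the first hypothesis agreeing with [(x, f(x)), ...]."""
    for i, (_, h) in enumerate(HYPOTHESES):
        if all(h(x) == y for x, y in segment):
            return i
    return None  # no consistent hypothesis in the list

def conjectures(f, n):
    """The learner's conjecture sequence on growing initial segments of f."""
    return [learner([(x, f(x)) for x in range(k + 1)]) for k in range(n)]

# On f(x) = x*x the conjecture sequence converges to the index of "square":
print(conjectures(lambda x: x * x, 6))  # [0, 1, 2, 2, 2, 2]
```

Once the conjectures stop changing on a correct index, the learner has EX-identified the function, even though it never announces that it is done.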

306 | Inductive Inference: Theory and Methods - Angluin, Smith - 1983 |

259 | Toward a mathematical theory of inductive inference - Blum, Blum - 1975 |

163 | Comparison of identification criteria for machine inductive inference (Theoretical Computer Science) - Case, Smith - 1983

Citation Context: "...; ···, n+1, and M(σ_{k_i}) ≠ M(σ_{k_{i+1}}), i = 1, ···, n. Moreover, we demand that there is no subsequence of length n+2 with the same property. Definition 3 ([CS83]): Machine M learns a recursive function f in the limit with a mind change bound n, written f ∈ EX_n(M), if f ∈ EX(M) and the number of mind changes committed by M on any (σ_k) ↗ f does not exceed ..."
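The mind-change bound of Definition 3 has a direct operational reading: count the positions where consecutive emitted conjectures differ. A self-contained sketch, with an illustrative conjecture sequence:

```python
# Mind-change counting per Definition 3: a learner EX_n-learns f only
# if the number of revisions in its conjecture sequence never exceeds n.

def mind_changes(conjecture_seq):
    """Count adjacent pairs of emitted (non-None) conjectures that differ."""
    emitted = [c for c in conjecture_seq if c is not None]
    return sum(1 for a, b in zip(emitted, emitted[1:]) if a != b)

# A learner that revises twice (illustrative sequence) respects an EX_2
# bound but violates EX_1:
assert mind_changes([0, 1, 2, 2, 2]) == 2
```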

114 | Systems that Learn - Osherson, Stob, et al. - 1986 |

101 | An Introduction to the General Theory of Algorithms - Machtey, Young - 1978 |

73 | General topology (Heldermann) - Engelking - 1989

Citation Context: "...U, V be arbitrary classes of recursive functions. We summarize some of the previous results: U ⊆ Ū; U ⊆ V implies Ū ⊆ V̄; and Ū equals its own closure. But Ū ∪ V̄ differs from the closure of U ∪ V, therefore U ↦ Ū is not a closure operator [Eng89]. To see this, consider U, V ∈ EX such that U ∪ V ∉ EX, as in Theorem 10. Then the closure of U ∪ V is F, but Ū ∪ V̄ ≠ F. Therefore the relation EX ⊂ [1,2]EX upsets this first attempt to ..."

61 | Theory of Recursive Functions and Effective Computability - Rogers - 1967

Citation Context: "...learnable by a single IIM. 2 Preliminaries: The set of all natural numbers is denoted by ℕ, the set of all single-argument recursive functions by F, and the set of partial recursive functions by P [Rog67]. Letters h, i, j, k, l, m, n, x, y vary over ℕ, f and g over F, and φ, ψ over P. Classes of recursive functions, i.e. subsets of F, are denoted by U, V, W, with or without decorations. By using..."

56 | The power of pluralism for automatic program synthesis - Smith - 1982

Citation Context: "...learning has been a prevalent theme in the study of inductive inference [Smi94b]. An early result asserts that the larger the team allowed, the larger the class of learnable sets of functions becomes [Smi82]. This suggests that perhaps different types of knowledge are needed to solve some problems. Indeed, most of the papers in our field have at least two coauthors. A precise correspondence between team ..."

40 | Probability and plurality for aggregations of learning machines - Pitt, Smith - 1988

Citation Context: "...ed from EX by team inference are reducible to them. The complete picture for the case EX is given by the following two theorems: Theorem 7 ([Smi82]): For any m > 0, [1,m]EX ⊂ [1,m+1]EX. Theorem 8 ([PS88]): For any n ≥ m > 0, [m,n]EX = [1,⌊n/m⌋]EX. For the types EX_n the picture of team inclusions is more complicated. We shall need these two properties: Theorem 9 ([KF92, AFK+92]): For any n ≥ 0, EX_n ⊂ ..."

33 | Probabilistic Inductive Inference - Pitt - 1985

Citation Context: "...nt types of knowledge are needed to solve some problems. Indeed, most of the papers in our field have at least two coauthors. A precise correspondence between team learning and probabilistic learning [Pit89] intensified the study of team learning. So far, all of the studies of team learning focus on how team size compares with other parameters relevant to the learning process [Sch86, PS88, FSV89, KZ91, D..."

26 | Two theorems on the limiting synthesis of functions - Barzdins - 1974 |

21 | A Recursive Introduction to the Theory of Computation - Smith - 1994 |

18 | Relations between probabilistic and team one-shot learners - Daley, Pitt, et al. - 1991 |

16 | Breaking the probability 1/2 barrier in FIN-type learning - Daley, Kalyanasundaram, et al. - 1995

16 | Three decades of team learning - Smith

Citation Context: "...ate that this is indeed the case, but only if instead of a single concept we consider a suitable infinite set of concepts. Team learning has been a prevalent theme in the study of inductive inference [Smi94b]. An early result asserts that the larger the team allowed, the larger the class of learnable sets of functions becomes [Smi82]. This suggests that perhaps different types of knowledge are needed to s..."

12 | One-sided error probabilistic inductive inference and reliable frequency identification - Kinber, Zeugmann - 1991 |

8 | Training sequences - Angluin, Gasarch, et al. - 1989

Citation Context: "...n constraints defining a criterion of successful learning [AS83, OSW86]. In this paper we investigate the dual problem: Given some set of concepts, which algorithms can learn all those concepts? From [AGS89] we know that in the theory of inductive inference sometimes one concept must be mastered before the learning of another concept can be initiated. This observation is consistent with the common human ..."

7 | Capabilities of fallible finite learning - Daley, Kalyanasundaram, et al. - 1993 |

4 | Trade-offs amongst parameters effecting the inductive inferribility of classes of recursive functions - Freivalds, Smith, et al. - 1989

3 | Choosing a learning team: a topological approach - Apsītis, Freivalds, et al. - 1994

Citation Context: "...usins@cclu.lv; Carl H. Smith, Department of Computer Science, University of Maryland, College Park, MD 20742, USA, smith@cs.umd.edu. Results collected in this paper were presented previously at conferences [AFS94] and [FS93]. Abstract: Previous work in inductive inference dealt mostly with finding one or several machines (IIMs) that successfully learn a collection of functions. Herein we start with a class of..."

3 | Some results in the theory of effective program synthesis: Learning by defective information - Schafer - 1986 |

2 | Unions of identifiable classes of total recursive functions - Apsitis, Freivalds, et al. - 1992 |

2 | On the duality between mechanistic learners and what it is they learn - Freivalds, Smith - 1997

Citation Context: "...v; Carl H. Smith, Department of Computer Science, University of Maryland, College Park, MD 20742, USA, smith@cs.umd.edu. Results collected in this paper were presented previously at conferences [AFS94] and [FS93]. Abstract: Previous work in inductive inference dealt mostly with finding one or several machines (IIMs) that successfully learn a collection of functions. Herein we start with a class of functions ..."

2 | Topologie, 4th edition, volume 20-21 of Monografie Matematyczne (Panstwowe Wydawnictwo Naukowe) - Kuratowski - 1958

Citation Context: "...s T_0 only. The following lemma enables us to proceed to diversifications for infinite families of machines as well, provided that all their finite subfamilies have a diversification. Lemma 31 (König, [Kur58]): In any infinite directed tree in which each node has only a finite number of direct successors, there is an infinite path proceeding from the root. Lemma 32: Let W be a class of functions. There exis..."
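König's lemma (Lemma 31) admits a constructive reading: at each node, descend into a child whose subtree is still infinite; since the root's subtree is infinite and branching is finite, such a child always exists. The sketch below follows such a path in a toy infinitely deep, finitely branching tree of binary strings; the particular tree and its infinitude test are illustrative assumptions, not from the paper.

```python
# Constructive reading of König's lemma (Lemma 31): in an infinite,
# finitely branching tree, repeatedly step to a child whose subtree is
# infinite.  The tree here -- binary strings containing at most one '1' --
# and its infinitude test are illustrative assumptions, not from the paper.

def children(node):
    """Direct successors: append '0' or '1', keeping at most one '1'."""
    return [node + b for b in "01" if node.count("1") + int(b) <= 1]

def subtree_infinite(node):
    """Every valid node of this tree roots an infinite subtree, since
    '0' can always be appended; so the test is true for all its nodes."""
    return node.count("1") <= 1

def konig_path(steps):
    """Follow `steps` edges from the root, always into an infinite subtree."""
    node, path = "", [""]
    for _ in range(steps):
        node = next(c for c in children(node) if subtree_infinite(c))
        path.append(node)
    return path
```

Iterating `konig_path` forever would trace the infinite path promised by the lemma; here it walks down the all-zero branch.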

1 | Inductive inference of total recursive functions by probabilistic and deterministic strategies - Krikis, Freivalds - 1992

1 | Inductive inference of recursive functions - Wiehagen - 1974

Citation Context: "...support and let V = {f ∈ F | φ_{f(0)} = f} be the self-describing functions. It is easy to verify that they both are in EX, but their union is not. Therefore, U ∪ V ∈ [1,2]EX and U ∪ V ∉ EX. Theorem 11 ([Wie74]): There is a class U ⊂ F and a function f such that U ∈ FIN and U ∪ {f} ∉ FIN. One such example is given below: f_n(x) = 1 if n = x, and 0 otherwise; U = {f_n | n ∈ ℕ}; f(x) = 0 for all x. Clearly, U ∪..."
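Why U ∪ {f} in Theorem 11 escapes FIN can be checked concretely: every finite all-zero initial segment of the zero function f is also an initial segment of some f_n, so a one-shot learner can never safely commit. A minimal sketch, where the commitment point k is an illustrative assumption:

```python
# Concrete check behind Theorem 11: the zero function agrees with f_k on
# any first k arguments, so a FIN (one-shot) learner that commits after
# seeing k zeros cannot tell f from f_k.  The commitment point k below is
# an illustrative assumption.

def f_n(n):
    """Member of U: the function that is 1 at n and 0 elsewhere."""
    return lambda x: 1 if x == n else 0

def zero(x):
    """The everywhere-zero function f of Theorem 11."""
    return 0

def segments_agree(k):
    """First k values of the zero function coincide with those of f_k."""
    return [zero(x) for x in range(k)] == [f_n(k)(x) for x in range(k)]

assert segments_agree(5)  # zeros up to argument 4 are consistent with f_5
```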