Extracting Comprehensible Models from Trained Neural Networks (1996)

by Mark W. Craven
Citations: 69 (4 self)

Documents Related by Co-Citation

4905 C4.5: Programs for Machine Learning – J R Quinlan - 1993
3874 Classification and Regression Trees – L Breiman, J H Friedman, R A Olshen, C J Stone - 1984
55 Knowledge Acquisition from Examples Via Multiple Models – Pedro Domingos - 1997
2479 Bagging Predictors – Leo Breiman - 1996
2863 UCI Repository of Machine Learning Databases [http://www.ics.uci.edu/~mlearn/MLRepository.html] – C L Blake, C J Merz - 1998
57 Exploring the decision forest: An empirical investigation of Occam’s razor in decision tree induction – Patrick M. Murphy, Michael J. Pazzani - 1994
3917 Pattern Classification and Scene Analysis – R O Duda, P Hart - 1973
233 Survey and Critique of Techniques for Extracting Rules from Trained Artificial Neural Networks – R Andrews, J Diederich, A B Tickle - 1995
1625 Experiments with a New Boosting Algorithm – Yoav Freund, Robert E. Schapire - 1996
88 Extracting Tree-Structured Representations of Trained Networks – Mark W. Craven, Jude W. Shavlik - 1996
8950 The Nature of Statistical Learning Theory – Vladimir N. Vapnik - 1995
52 RL4: A Tool for Knowledge-Based Induction – S Clearwater, F Provost - 1990
3335 Induction of Decision Trees – J. R. Quinlan - 1986
66 The Effects of Training Set Size on Decision Tree Complexity – T Oates, D Jensen - 1997
79 A Theory of Learning Classification Rules – Wray Lindsay Buntine - 1992
293 Inferring Decision Trees Using the Minimum Description Length Principle – J R Quinlan, R L Rivest - 1989
52 Lookahead and Pathology in Decision Tree Induction – Sreerama K. Murthy, Steven Salzberg - 1995
144 A conservation law for generalization performance – C Schaffer - 1994
2307 A Decision-Theoretic Generalization of On-Line Learning and an Application to Boosting – Yoav Freund, Robert E. Schapire - 1997