Results 1 - 10 of 1,204

Intensional Polymorphism in Type-Erasure Semantics

by Karl Crary, Stephanie Weirich, Greg Morrisett, 2002
"... Intensional polymorphism, the ability to dispatch to different routines based on types at run time, enables a variety of advanced implementation techniques for polymorphic languages, including tag-free garbage collection, unboxed function arguments, polymorphic marshalling, and flattened data structu ..."
Abstract - Cited by 137 (36 self)
structures. To date, languages that support intensional polymorphism have required a type-passing (as opposed to type-erasure) interpretation where types are constructed and passed to polymorphic functions at run time. Unfortunately, type-passing suffers from a number of drawbacks: it requires duplication

Intensional Polymorphism in Type-Erasure Semantics

by Karl Crary, Stephanie Weirich, Greg Morrisett - Journal of Functional Programming, 1998
"... Intensional polymorphism, the ability to dispatch to different routines based on types at run time, enables a variety of advanced implementation techniques for polymorphic languages, including tag-free garbage collection, unboxed function arguments, polymorphic marshalling, and flattened data struct ..."
Abstract
structures. To date, languages that support intensional polymorphism have required a type-passing (as opposed to type-erasure) interpretation where types are constructed and passed to polymorphic functions at run time. Unfortunately, type-passing suffers from a number of drawbacks: it requires duplication
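The two records above describe the same paper. As a rough illustration of the idea they discuss (invented for this listing, not the paper's typed lambda calculus), the sketch below mimics intensional type analysis in Python: a polymorphic marshaller dispatches on an explicit, term-level representation of a type, which is what a type-erasure semantics passes around instead of the type itself. The representation classes RInt and RList are assumptions of the sketch.

    # A rough sketch of intensional type analysis: a polymorphic marshaller
    # dispatches on a term-level representation of a type. Under a type-erasure
    # semantics, ordinary values such as RInt() and RList(RInt()) stand in for
    # the types that would otherwise have to be passed at run time.
    from dataclasses import dataclass

    class Rep:                        # R(tau): run-time representation of a type
        pass

    class RInt(Rep):                  # represents int
        pass

    @dataclass
    class RList(Rep):                 # represents "list of tau"
        elem: Rep

    def marshal(rep, value):
        """Typecase over the representation: pick a routine based on the type."""
        if isinstance(rep, RInt):
            return value.to_bytes(4, "little", signed=True)
        if isinstance(rep, RList):
            parts = [marshal(rep.elem, v) for v in value]
            return len(parts).to_bytes(4, "little") + b"".join(parts)
        raise TypeError("unsupported type representation")

    print(marshal(RList(RInt()), [1, 2, 3]).hex())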

Sparse Principal Component Analysis

by Hui Zou, Trevor Hastie, Robert Tibshirani - Journal of Computational and Graphical Statistics, 2004
"... Principal component analysis (PCA) is widely used in data processing and dimensionality reduction. However, PCA suffers from the fact that each principal component is a linear combination of all the original variables, thus it is often difficult to interpret the results. We introduce a new method ca ..."
Abstract - Cited by 279 (6 self)
Principal component analysis (PCA) is widely used in data processing and dimensionality reduction. However, PCA suffers from the fact that each principal component is a linear combination of all the original variables, thus it is often difficult to interpret the results. We introduce a new method
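As a quick illustration of the interpretability point made in this abstract, the sketch below contrasts ordinary PCA with scikit-learn's SparsePCA (used here only as a stand-in, not the authors' exact elastic-net formulation; the data and parameters are made up): the sparse loadings are mostly zero, so each component involves only a handful of the original variables.

    # Contrast dense PCA loadings with sparse ones on synthetic data.
    import numpy as np
    from sklearn.decomposition import PCA, SparsePCA

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 10))
    X[:, :3] += 3 * rng.normal(size=(200, 1))     # one block of correlated variables

    pca = PCA(n_components=2).fit(X)
    spca = SparsePCA(n_components=2, alpha=1.0, random_state=0).fit(X)

    print("dense loadings, PC1: ", np.round(pca.components_[0], 2))
    print("sparse loadings, PC1:", np.round(spca.components_[0], 2))
    # Every dense loading is nonzero; most sparse loadings are exactly zero,
    # which is what makes the sparse components easier to interpret.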

Exploiting hardware performance counters with flow and context sensitive profiling

by Glenn Ammons, Thomas Ball, James R. Larus - ACM SIGPLAN Notices, 1997
"... A program profile attributes run-time costs to portions of a program's execution. Most profiling systems suffer from two major deficiencies: first, they only apportion simple metrics, such as execution frequency or elapsed time to static, syntactic units, such as procedures or statements; second, the ..."
Abstract - Cited by 254 (9 self)
A program profile attributes run-time costs to portions of a program's execution. Most profiling systems suffer from two major deficiencies: first, they only apportion simple metrics, such as execution frequency or elapsed time to static, syntactic units, such as procedures or statements; second
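The distinction this abstract draws (metrics attributed to static procedures versus to flow and calling context) can be seen in toy form below. This is a wall-clock, pure-Python context profiler built on sys.setprofile; it has nothing to do with hardware performance counters and is not the authors' system, but it shows what "attributing cost to a call path" means.

    # Toy context-sensitive profiler: inclusive wall-clock time per call path,
    # so slow() called directly is kept separate from slow() called via helper().
    import collections
    import sys
    import time

    class ContextProfiler:
        def __init__(self):
            self.cost = collections.defaultdict(float)   # "a -> b" -> seconds
            self._stack, self._starts = [], []

        def _hook(self, frame, event, arg):
            if event == "call":
                self._stack.append(frame.f_code.co_name)
                self._starts.append(time.perf_counter())
            elif event == "return" and self._stack:
                path = " -> ".join(self._stack)
                self.cost[path] += time.perf_counter() - self._starts.pop()
                self._stack.pop()

        def __enter__(self):
            sys.setprofile(self._hook)
            return self

        def __exit__(self, *exc):
            sys.setprofile(None)

    def slow():
        time.sleep(0.05)

    def helper():
        slow()                      # same procedure, different calling context

    with ContextProfiler() as prof:
        slow()
        helper()

    for path, seconds in sorted(prof.cost.items()):
        print(f"{seconds:7.4f}s  {path}")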

Structural Models of Corporate Bond Pricing: An Empirical Analysis

by Young Ho Eom, Jean Helwege, Jing-zhi Huang, 2003
"... This paper empirically tests five structural models of corporate bond pricing: those of Merton (1974), Geske (1977), Leland and Toft (1996), Longstaff and Schwartz (1995), and Collin-Dufresne and Goldstein (2001). We implement the models using a sample of 182 bond prices from firms with simple capita ..."
Abstract - Cited by 245 (6 self)
structural models predict spreads that are too high on average. Nevertheless, accuracy is a problem, as the newer models tend to severely overstate the credit risk of firms with high leverage or volatility and yet suffer from a spread underprediction problem with safer bonds. The Leland and Toft model
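For readers unfamiliar with this family of models, the snippet below is a back-of-the-envelope sketch of the simplest one tested in the paper, Merton (1974), in which equity is a European call on firm value and the credit spread falls out of the implied price of the risky debt. It is not the paper's implementation, and the inputs are invented.

    # Merton (1974) credit spread: equity = Black-Scholes call on firm value V
    # with strike F (face value of zero-coupon debt) and maturity T.
    from math import exp, log, sqrt
    from statistics import NormalDist

    def merton_spread(V, F, r, sigma, T):
        N = NormalDist().cdf
        d1 = (log(V / F) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
        d2 = d1 - sigma * sqrt(T)
        equity = V * N(d1) - F * exp(-r * T) * N(d2)   # call value of equity
        debt = V - equity                              # risky debt value
        y = -log(debt / F) / T                         # implied debt yield
        return y - r                                   # spread over the riskless rate

    # Hypothetical firm: value 120, debt face 100, 5 years, 25% asset volatility.
    print(f"{10_000 * merton_spread(V=120, F=100, r=0.05, sigma=0.25, T=5):.1f} bp")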

su

by Kalin Kouzmanov, Robert Moritz, Albrecht Von Quadt, Massimo Chiaradia, Irena Peytcheva, Denis Fontignie, Claire Ramboz, Kamen Bogdanov
"... er sio n 1 ..."
Abstract
version 1

Approximate String Joins in a Database (Almost) for Free - Erratum

by Luis Gravano, Panagiotis G. Ipeirotis, H. V. Jagadish, Nick Koudas, S. Muthukrishnan, Divesh Srivastava - In VLDB, 2003
"... case the result returned by the Figure 1 query is incomplete and suffers from "false negatives," in contrast to our claim to the contrary in [GIJ 01b]. In general, the string pairs that are omitted are pairs of short strings. Even when these strings match within small edit distance, t ..."
Abstract - Cited by 210 (16 self)
case the result returned by the Figure 1 query is incomplete and suffers from "false negatives," in contrast to our claim to the contrary in [GIJ 01b]. In general, the string pairs that are omitted are pairs of short strings. Even when these strings match within small edit distance
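The false-negative mechanism described in this erratum is easy to reproduce: two short strings can be within the edit-distance threshold and still share no q-grams at all, so a join that pairs strings through common q-grams never emits them. Below is a small self-contained demonstration; the padding convention and parameters follow the general q-gram scheme rather than the paper's exact SQL.

    # Show a pair of short strings within edit distance k that share no q-grams,
    # and so would be silently dropped by a q-gram-based approximate join.
    def edit_distance(a, b):
        d = list(range(len(b) + 1))
        for i in range(1, len(a) + 1):
            prev, d[0] = d[0], i
            for j in range(1, len(b) + 1):
                prev, d[j] = d[j], min(d[j] + 1, d[j - 1] + 1,
                                       prev + (a[i - 1] != b[j - 1]))
        return d[-1]

    def qgrams(s, q=3):
        padded = "#" * (q - 1) + s + "$" * (q - 1)    # pad so ends form q-grams
        return {padded[i:i + q] for i in range(len(padded) - q + 1)}

    a, b, k = "ab", "xy", 2
    print(edit_distance(a, b) <= k)       # True: a genuine match at threshold k
    print(qgrams(a) & qgrams(b))          # set(): no common q-gram, so a join
                                          # on shared q-grams misses this pair.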

Bias plus variance decomposition for zero-one loss functions

by Ron Kohavi - In Machine Learning: Proceedings of the Thirteenth International Conference, 1996
"... We present a bias-variance decomposition of expected misclassification rate, the most commonly used loss function in supervised classification learning. The bias-variance decomposition for quadratic loss functions is well known and serves as an important tool for analyzing learning algorithms, yet no ..."
Abstract - Cited by 212 (5 self)
no decomposition was offered for the more commonly used zero-one (misclassification) loss functions until the recent work of Kong & Dietterich (1995) and Breiman (1996). Their decomposition suffers from some major shortcomings though (e.g., potentially negative variance), which our decomposition avoids. We show
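To make the decomposition concrete, here is a rough empirical version of the idea, in the general spirit of such decompositions rather than Kohavi's exact definitions (which are constructed so that the variance term cannot go negative): retrain a classifier on many resampled training sets, take the majority prediction per test point, and split the zero-one error into disagreement with the truth versus disagreement among the runs. Data, model, and the exact split are illustrative choices.

    # Empirical bias/variance-style split of zero-one loss over bootstrap retraining.
    from collections import Counter

    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    rng = np.random.default_rng(0)
    X = rng.normal(size=(600, 5))
    y = (X[:, 0] + 0.5 * rng.normal(size=600) > 0).astype(int)     # noisy labels
    Xtr, ytr, Xte, yte = X[:400], y[:400], X[400:], y[400:]

    preds = []
    for _ in range(50):                                 # 50 bootstrap training sets
        idx = rng.integers(0, len(Xtr), size=len(Xtr))
        preds.append(DecisionTreeClassifier().fit(Xtr[idx], ytr[idx]).predict(Xte))
    preds = np.array(preds)                             # shape: (runs, test points)

    main = np.array([Counter(col).most_common(1)[0][0] for col in preds.T])
    bias_like = np.mean(main != yte)      # majority vote disagrees with the truth
    var_like = np.mean(preds != main)     # individual runs disagree with the vote
    print(f"bias-like {bias_like:.3f}, variance-like {var_like:.3f}, "
          f"error {np.mean(preds != yte):.3f}")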

in su

by unknown authors
"... v er sio n 1 ..."
Abstract
version 1

CPAR: Classification based on Predictive Association Rules

by Xiaoxin Yin, Jiawei Han, 2003
"... Recent studies in data mining have proposed a new classification approach, called associative classification, which, according to several reports, such as [7, 6], achieves higher classification accuracy than traditional classification approaches such as C4.5. However, the approach also suffers from t ..."
Abstract - Cited by 199 (3 self)
Recent studies in data mining have proposed a new classification approach, called associative classification, which, according to several reports, such as [7, 6], achieves higher classification accuracy than traditional classification approaches such as C4.5. However, the approach also suffers from
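As background for readers new to associative classification, the toy sketch below shows the general flavour of the approach (a simplification written for this listing, not CPAR's own rule-generation algorithm): class-association rules are scored by Laplace accuracy, and a new case is labelled by averaging its best few matching rules per class.

    # Toy associative classification: rules {items} -> class scored by Laplace
    # accuracy; a case is labelled by its best k matching rules per class.
    from dataclasses import dataclass

    @dataclass
    class Rule:
        items: frozenset      # antecedent itemset
        label: str            # predicted class
        hits: int             # covered training cases with this class
        covered: int          # covered training cases in total

        def laplace(self, n_classes=2):
            return (self.hits + 1) / (self.covered + n_classes)

    def classify(case, rules, k=3):
        scores = {}
        for label in {r.label for r in rules}:
            best = sorted((r.laplace() for r in rules
                           if r.label == label and r.items <= case), reverse=True)[:k]
            if best:
                scores[label] = sum(best) / len(best)
        return max(scores, key=scores.get) if scores else None

    rules = [Rule(frozenset({"a", "b"}), "yes", 8, 9),
             Rule(frozenset({"c"}), "no", 5, 7)]
    print(classify(frozenset({"a", "b", "d"}), rules))    # -> yes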