Results 1–10 of 619
The Foundations of Cost-Sensitive Learning
In Proceedings of the Seventeenth International Joint Conference on Artificial Intelligence, 2001
"... This paper revisits the problem of optimal learning and decisionmaking when different misclassification errors incur different penalties. We characterize precisely but intuitively when a cost matrix is reasonable, and we show how to avoid the mistake of defining a cost matrix that is economically i ..."
Cited by 402 (6 self)
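The decision rule at the heart of cost-sensitive learning is easy to state: predict the class that minimizes expected cost under the conditional class probabilities. A minimal sketch of that rule (not the paper's code; the probability and cost values below are invented for illustration):

```python
import numpy as np

# Hypothetical cost matrix: cost[i, j] = cost of predicting class i
# when the true class is j (here a missed positive costs 10x a false alarm).
cost = np.array([[0.0, 10.0],
                 [1.0,  0.0]])

def cost_sensitive_decision(p, cost):
    """Return the class index minimizing expected misclassification cost.

    p    -- conditional class probabilities P(j | x)
    cost -- cost[i, j] = cost of predicting i when the truth is j
    """
    expected = cost @ p  # expected[i] = sum_j cost[i, j] * P(j | x)
    return int(np.argmin(expected))

p = np.array([0.8, 0.2])
print(cost_sensitive_decision(p, cost))  # -> 1, despite class 0 being more probable
```

With these numbers the minimum-cost prediction is class 1 even though class 0 is four times more probable; this is exactly the regime in which an unreasonable cost matrix silently distorts every decision, which is what the paper's conditions guard against.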
Interpolation revisited
IEEE Transactions on Medical Imaging, 2000
"... Abstract—Based on the theory of approximation, this paper presents a unified analysis of interpolation and resampling techniques. An important issue is the choice of adequate basis functions. We show that, contrary to the common belief, those that perform best are not interpolating. By opposition to ..."
Abstract

Cited by 198 (33 self)
 Add to MetaCart
to traditional interpolation, we call their use generalized interpolation; they involve a prefiltering step when correctly applied. We explain why the approximation order inherent in any basis function is important to limit interpolation artifacts. The decomposition theorem states that any basis function endowed
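The prefiltering step the abstract refers to is visible in SciPy's B-spline resampling routines: cubic B-splines are not interpolating, so the samples must first be converted into spline coefficients. A rough sketch using SciPy's API (not the authors' code):

```python
import numpy as np
from scipy import ndimage

signal = np.sin(np.linspace(0.0, 2.0 * np.pi, 32))

# Prefilter: turn the samples into cubic B-spline coefficients.  This is
# the step that makes a non-interpolating basis reproduce the samples.
coeffs = ndimage.spline_filter1d(signal, order=3)

# Resample at fractional positions from the coefficients; prefilter=False
# because the prefiltering was already done above.
positions = np.array([[2.5, 10.25, 20.75]])
print(ndimage.map_coordinates(coeffs, positions, order=3, prefilter=False))
```

Evaluating the B-spline on the raw samples without the prefilter is the classic mistake the paper warns about: it smooths the data instead of interpolating it.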
Cox's Theorem Revisited
Journal of Artificial Intelligence Research, 1999
"... The assumptions needed to prove Cox's Theorem are discussed and examined. Various sets of assumptions under which a Coxstyle theorem can be proved are provided, although all are rather strong and, arguably, not natural. I recently wrote a paper (Halpern, 1999) casting doubt on how compelling ..."
Cited by 13 (0 self)
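For orientation, the functional equations at the core of a Cox-style argument are usually stated along the following lines (exact formulations vary, and the strength of the accompanying technical assumptions is precisely what the paper examines):

```latex
% Cox's assumptions (one common formulation): belief in a conjunction is a
% function F of the beliefs in its parts, and belief in a negation is a
% function S of the belief in the proposition.
\mathrm{Bel}(A \wedge B \mid C) = F\bigl(\mathrm{Bel}(A \mid B \wedge C),\, \mathrm{Bel}(B \mid C)\bigr)
\qquad
\mathrm{Bel}(\lnot A \mid C) = S\bigl(\mathrm{Bel}(A \mid C)\bigr)
```

Under suitable regularity assumptions on F and S, the conclusion is that Bel is a monotone transform of a probability measure; weakening those regularity assumptions is where the difficulties discussed above arise.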
Double Description Method Revisited
1996
"... . The double description method is a simple and useful algorithm for enumerating all extreme rays of a general polyhedral cone in IR d , despite the fact that we can hardly state any interesting theorems on its time and space complexities. In this paper, we reinvestigate this method, introduce som ..."
Cited by 109 (2 self)
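The basic double description step is short enough to sketch. The version below (an illustrative Python sketch, not the paper's implementation) processes the constraints of {x : Ax >= 0} one at a time, keeping a generating set of rays; it omits the adjacency test that a serious implementation uses to keep only extreme rays, so its output may contain redundant generators:

```python
import numpy as np

def dd_generators(A, tol=1e-9):
    """Generating rays of the cone {x : A @ x >= 0} (naive double description).

    Each constraint a splits the current rays by the sign of a @ r; pairs
    with opposite signs are combined into rays lying on the hyperplane
    a @ x = 0.  Without an adjacency test the result can be redundant.
    """
    d = A.shape[1]
    rays = list(np.eye(d)) + list(-np.eye(d))  # generators of all of R^d
    for a in A:
        pos = [r for r in rays if a @ r > tol]
        zero = [r for r in rays if abs(a @ r) <= tol]
        neg = [r for r in rays if a @ r < -tol]
        new = [(a @ rp) * rn - (a @ rn) * rp for rp in pos for rn in neg]
        new = [n for n in new if np.linalg.norm(n) > tol]  # drop degenerate rays
        rays = pos + zero + new
    return rays

# The first orthant {x >= 0, y >= 0} in R^2 is generated by e1 and e2.
print(dd_generators(np.array([[1.0, 0.0], [0.0, 1.0]])))
```

The adjacency test (keeping only combinations whose zero sets are not dominated by a third ray) is one of the refinements a practical implementation needs.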
The Conservation Theorem revisited
1993
"... This paper describes a method of proving strong normalization based on an extension of the conservation theorem. We introduce a structural notion of reduction that we call fi S , and we prove that any term that has a fi I fi Snormal form is strongly finormalizable. We show how to use this result ..."
Cited by 37 (0 self)
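For context, the classical conservation theorem being extended concerns βI-redexes, i.e. redexes that erase nothing; a standard formulation (e.g. in Barendregt's book) is:

```latex
% A beta_I-redex is one whose bound variable actually occurs, so that
% contracting it erases no subterm:
(\lambda x.\, M)\, N \text{ is a } \beta_I\text{-redex} \iff x \in \mathrm{FV}(M)
% Conservation (classical form): beta_I-steps preserve infinite reductions,
M \to_{\beta_I} N \text{ and } M \text{ admits an infinite } \beta\text{-reduction}
\implies N \text{ admits an infinite } \beta\text{-reduction}
```

The paper's βS plays the role of a second erasure-free notion of reduction, strong enough that reaching a βIβS-normal form already certifies strong β-normalization.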
Revisiting the Miles and Snow Strategic Framework: Uncovering Interrelationships Between Strategic Types, Capabilities, Environmental Uncertainty, and Firm Performance
Strategic Management Journal, 2005
"... The Miles and Snow strategic type framework is reexamined with respect to interrelationships with several theoretically relevant batteries of variables, including SBU strategic capabilities, environmental uncertainty, and performance. A newly developed constrained, multiobjective, classification m ..."
Cited by 50 (1 self)
The Graves Theorem Revisited
1996
"... this paper, we prove that the Graves theorem is a consequence of the following general result: the openness with linear rate of a locally closed setvalued map F around a point (x 0 ; y 0 ) of its graph is invariant with respect to a perturbation of the form f +F provided that the strict derivative ..."
Cited by 22 (7 self)
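The property in question, openness with linear rate (also called linear openness), is standardly written as follows in the metric regularity literature; U, V, a, and t* below are the usual localizing data:

```latex
% Openness with linear rate a > 0 of F around (x_0, y_0) \in \operatorname{gph} F:
\exists\, a > 0,\ \text{nbhds } U \ni x_0,\ V \ni y_0,\ t^{*} > 0 \ \text{s.t.}
\quad y + a\, t\, \mathbb{B}_Y \subseteq F(x + t\, \mathbb{B}_X)
\quad \forall\, (x, y) \in \operatorname{gph} F \cap (U \times V),\ \forall\, t \in (0, t^{*})
```

Graves' original theorem corresponds to the single-valued case; the result quoted above asserts that this openness survives perturbations of the form f + F under a condition on the strict derivative of f.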
Markov’s Theorem revisited
 J. Approx. Theory
"... Abstract. The fact that Markov’sTheoremholds for determinatemeasuresisoften overlooked and the theorem is stated for measures with compact support as did Markov. We shall give a brief survey of the history of the theorem as well as a proof in the determinate case. We also prove a version of Markov’s ..."
Cited by 12 (0 self)
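In the notation standard for the moment problem (p_n the orthonormal polynomials of μ, q_n the associated second-kind polynomials; sign and normalization conventions vary), the determinate version alluded to above reads roughly:

```latex
% Markov's theorem in the determinate case:
\lim_{n \to \infty} \frac{q_n(z)}{p_n(z)} = \int_{\mathbb{R}} \frac{d\mu(t)}{z - t},
\quad \text{uniformly on compact subsets of } \mathbb{C} \setminus \mathbb{R}
```

Markov's original statement assumed compact support and gave convergence off the support; determinacy of the moment problem is the weaker hypothesis under which the limit still holds.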
Standardization Theorem Revisited
1996
"... Standardization theorem is one of the most important theory for foundation of computation. Since it was firstly referred to by Curry and Feys, much effort have been made by several researchers. One of the most important work was done by Klop, who brought a nice idea to treat standardization in elega ..."
Cited by 8 (3 self)
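The theorem itself is brief to state: every reduction sequence can be rearranged into one that contracts redexes in a standard, left-to-right order. Schematically:

```latex
% Standardization: any beta-reduction sequence can be rearranged into a
% standard one, contracting residuals of redexes from left to right:
M \twoheadrightarrow_{\beta} N \implies M \twoheadrightarrow_{\mathrm{std}} N
```

Klop's contribution, referenced in the abstract, was an elegant proof technique for this rearrangement.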