Results 1 – 10 of 3,900
Weighted Variance Swap
"... Let the underlying process Y be a semimartingale taking values in an interval I. Let ϕ : I → R be a difference of convex functions, and let X := ϕ(Y). A typical application takes Y to be a positive price process and ϕ(y) = log y for y ∈ I = (0, ∞). Then [the floating leg of] a forward-starting wei ..."
A forward-starting weighted variance swap or generalized variance swap on ϕ(Y) (shortened to "on Y" if the ϕ is understood), with weight process w_t, forward-start time θ, and expiry T, is defined to pay, at a fixed time, a payoff in which [·] denotes quadratic variation. In the case that θ = 0, the trade date, we have a
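Read literally, the payoff this snippet describes can be sketched as follows (a reconstruction from the definitions in the snippet, not a formula quoted from the paper):

```latex
% Floating leg of a forward-starting weighted variance swap on Y,
% with X := \varphi(Y), weight process w_t, forward-start time \theta, expiry T:
\text{floating leg} \;=\; \int_\theta^T w_t \, d[X]_t ,
\qquad [X] \ \text{the quadratic variation of } X .
```

With ϕ(y) = log y and constant weight w ≡ 1 this reduces to the floating leg of a plain variance swap on the log-price; taking θ = 0 starts the weighting at the trade date, as the snippet notes.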
An empirical comparison of voting classification algorithms: Bagging, boosting, and variants.
 Machine Learning, 1999
"... Abstract. Methods for voting classification algorithms, such as Bagging and AdaBoost, have been shown to be very successful in improving the accuracy of certain classifiers for artificial and real-world datasets. We review these algorithms and describe a large empirical study comparing several vari ..."
Cited by 707 (2 self)
and variance decomposition of the error to show how different methods and variants influence these two terms. This allowed us to determine that Bagging reduced variance of unstable methods, while boosting methods (AdaBoost and Arc-x4) reduced both the bias and variance of unstable methods but increased
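The bagging procedure this entry refers to can be sketched in a few lines: train base learners on bootstrap resamples and combine them by majority vote. A minimal illustration, assuming a toy 1-D "stump" learner invented here (not the paper's code):

```python
import random
from collections import Counter

def bagging_predict(train, x, base_learner, n_models=25, seed=0):
    """Train base learners on bootstrap resamples and combine by majority vote."""
    rng = random.Random(seed)
    votes = []
    for _ in range(n_models):
        # Bootstrap: sample |train| examples with replacement.
        sample = [rng.choice(train) for _ in train]
        model = base_learner(sample)
        votes.append(model(x))
    return Counter(votes).most_common(1)[0][0]

def stump(sample):
    """Toy learner: predict the majority label on each side of x = 0."""
    pos = [y for x, y in sample if x >= 0]
    neg = [y for x, y in sample if x < 0]
    maj = lambda ys: Counter(ys).most_common(1)[0][0] if ys else 0
    return lambda x: maj(pos) if x >= 0 else maj(neg)

data = [(-2, 0), (-1, 0), (1, 1), (2, 1)]
print(bagging_predict(data, 1.5, stump))  # majority vote over 25 stumps
```

Bagging helps exactly when the base learner is unstable, i.e. when small changes in the training sample change its predictions; the vote averages that variability away.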
Image denoising using a scale mixture of Gaussians in the wavelet domain
 IEEE Trans. Image Processing, 2003
"... We describe a method for removing noise from digital images, based on a statistical model of the coefficients of an overcomplete multiscale oriented basis. Neighborhoods of coefficients at adjacent positions and scales are modeled as the product of two independent random variables: a Gaussian vecto ..."
Cited by 513 (17 self)
vector and a hidden positive scalar multiplier. The latter modulates the local variance of the coefficients in the neighborhood, and is thus able to account for the empirically observed correlation between the coefficient amplitudes. Under this model, the Bayesian least squares estimate of each
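The key modeling idea in this entry is easy to demonstrate numerically: if a neighborhood of coefficients is the product of an independent Gaussian vector and a shared hidden positive multiplier, the coefficients themselves are uncorrelated but their amplitudes are not. A small simulation of that claim (illustrative only; the multiplier's log-normal prior is an assumption here, not the paper's estimator):

```python
import numpy as np

rng = np.random.default_rng(0)
n, dim = 50_000, 2

# Gaussian scale mixture: each 2-coefficient "neighborhood" is sqrt(z) * u,
# with u ~ N(0, I) and z a hidden positive scalar shared within the neighborhood.
z = np.exp(rng.normal(0.0, 1.0, size=(n, 1)))   # log-normal multiplier (assumed)
u = rng.normal(size=(n, dim))
x = np.sqrt(z) * u

# The shared multiplier modulates the local variance, so amplitudes correlate
# even though the underlying Gaussian components are independent.
corr_coeff = np.corrcoef(x[:, 0], x[:, 1])[0, 1]
corr_amp = np.corrcoef(np.abs(x[:, 0]), np.abs(x[:, 1]))[0, 1]
print(f"corr(x1, x2)     = {corr_coeff:.3f}")   # near 0
print(f"corr(|x1|, |x2|) = {corr_amp:.3f}")     # clearly positive
```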
Boosting the margin: A new explanation for the effectiveness of voting methods
 In Proceedings of the International Conference on Machine Learning, 1997
"... One of the surprising recurring phenomena observed in experiments with boosting is that the test error of the generated classifier usually does not increase as its size becomes very large, and often is observed to decrease even after the training error reaches zero. In this paper, we show that this ..."
Cited by 897 (52 self)
that techniques used in the analysis of Vapnik’s support vector classifiers and of neural networks with small weights can be applied to voting methods to relate the margin distribution to the test error. We also show theoretically and experimentally that boosting is especially effective at increasing the margins
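The margin this entry refers to is, for a voting classifier, the weighted vote for the correct label minus the vote for the other label, normalized by the total weight. A minimal sketch for the binary case with labels in {-1, +1} (the numbers are hypothetical):

```python
def margins(votes, weights, labels):
    """Margin of a weighted binary vote: y * sum_t(a_t * h_t(x)) / sum_t(a_t).
    votes[i][t] is base classifier t's prediction (+1 or -1) on example i."""
    total = sum(weights)
    return [y * sum(a * h for a, h in zip(weights, hs)) / total
            for hs, y in zip(votes, labels)]

# Two weak hypotheses with weights 2 and 1; example 0 is classified
# correctly by both, example 1 only by the lighter one.
ms = margins(votes=[[+1, +1], [-1, +1]], weights=[2, 1], labels=[+1, +1])
print(ms)  # [1.0, -0.333...]
```

A positive margin means a correct combined prediction; the paper's point is that boosting keeps pushing these margins up even after the training error (the fraction of negative margins) hits zero.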
Linear models and empirical Bayes methods for assessing differential expression in microarray experiments.
 Stat. Appl. Genet. Mol. Biol., 2004
"... Abstract The problem of identifying differentially expressed genes in designed microarray experiments is considered. Lonnstedt and Speed (2002) derived an expression for the posterior odds of differential expression in a replicated two-color experiment using a simple hierarchical parametric model. ..."
Cited by 1321 (24 self)
from spot filtering or spot quality weights. The posterior odds statistic is reformulated in terms of a moderated t-statistic in which posterior residual standard deviations are used in place of ordinary standard deviations. The empirical Bayes approach is equivalent to shrinkage of the estimated
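The moderated t-statistic replaces each gene's sample variance with a value shrunk toward a prior estimate, weighting by degrees of freedom. A sketch of that shrinkage with hypothetical numbers (the prior values s0_2 and d0 here are made up for illustration, not fitted as in the paper):

```python
import math

def moderated_t(beta, s2, d, s0_2, d0, v=1.0):
    """Moderated t: shrink sample variance s2 (d d.f.) toward the prior
    value s0_2 (d0 d.f.), then form beta / (s_tilde * sqrt(v))."""
    s_tilde2 = (d0 * s0_2 + d * s2) / (d0 + d)
    return beta / math.sqrt(s_tilde2 * v)

# A gene with an unusually small sample variance is pulled toward the prior,
# tempering an otherwise inflated ordinary t-statistic.
t_ord = 1.5 / math.sqrt(0.01)                       # ordinary t = 15.0
t_mod = moderated_t(1.5, 0.01, d=4, s0_2=0.05, d0=4)
print(round(t_ord, 2), round(t_mod, 2))             # 15.0 8.66
```

The shrinkage is what makes the statistic stable with the tiny replicate numbers typical of microarray designs: genes can no longer rank highly just because their variance estimate happened to be near zero.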
The Comparison of Efficiency of Control Chart by Weighted Variance Method, Scaled Weighted Variance Method, Empirical Quantiles Method and Extreme-value Theory for Skewed Populations
, 2006
"... The objective of this study is to compare the efficiency of control charts constructed by the Weighted Variance Method, the Scaled Weighted Variance Method, the Empirical Quantiles Method and Extreme-value Theory for skewed populations. The efficiencies of the control charts are determined by the average run length. The control char ..."
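The weighted variance method handles skewness by using different effective standard deviations above and below the mean, based on p = P(X ≤ μ). A sketch of the X̄-chart limits in the form usually given for this method (stated here as an assumption, not quoted from the paper):

```python
import math

def wv_limits(mu, sigma, p, n):
    """Weighted-variance X-bar chart limits for a skewed process.
    p = P(X <= mu); for a symmetric distribution p = 0.5 and the limits
    reduce to the usual mu +/- 3*sigma/sqrt(n)."""
    ucl = mu + 3 * (sigma / math.sqrt(n)) * math.sqrt(2 * p)
    lcl = mu - 3 * (sigma / math.sqrt(n)) * math.sqrt(2 * (1 - p))
    return lcl, ucl

# Right-skewed process: more mass below the mean (p > 0.5) widens the
# upper limit and tightens the lower one.
print(wv_limits(mu=10.0, sigma=2.0, p=0.65, n=4))
print(wv_limits(mu=10.0, sigma=2.0, p=0.5, n=4))  # symmetric case: (7.0, 13.0)
```

The average run length mentioned in the abstract is then the expected number of samples until a point falls outside (lcl, ucl), which is how the competing methods are scored.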
A Multivariate EWMA Control Chart for Skewed Populations using Weighted Variance Method
, 2014
Arcing Classifiers
, 1998
"... Recent work has shown that combining multiple versions of unstable classifiers such as trees or neural nets results in reduced test set error. One of the more effective methods is bagging (Breiman [1996a]). Here, modified training sets are formed by resampling from the original training set, classifiers con ..."
Cited by 345 (6 self)
and the combining is done by weighted voting. Arcing is more successful than bagging in test set error reduction. We explore two arcing algorithms, compare them to each other and to bagging, and try to understand how arcing works. We introduce the definitions of bias and variance for a classifier as components
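One of the arcing rules studied in this line of work, Arc-x4, resamples each training example with probability proportional to 1 + m^4, where m counts how often that example has been misclassified so far. A minimal sketch of just that reweighting step (the full training loop is omitted):

```python
def arc_x4_weights(miscounts):
    """Arc-x4 resampling probabilities: proportional to 1 + m_i**4,
    where m_i is how often example i has been misclassified so far."""
    raw = [1 + m ** 4 for m in miscounts]
    total = sum(raw)
    return [r / total for r in raw]

# An example misclassified 3 times dominates the next resample.
probs = arc_x4_weights([0, 0, 3])
print([round(p, 3) for p in probs])  # [0.012, 0.012, 0.976]
```

Concentrating the resample on hard examples is what distinguishes arcing from plain bagging, whose bootstrap weights every example equally.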
Robust minimum variance beamforming
 IEEE Transactions on Signal Processing, 2005
"... Abstract—This paper introduces an extension of minimum variance beamforming that explicitly takes into account variation or uncertainty in the array response. Sources of this uncertainty include imprecise knowledge of the angle of arrival and uncertainty in the array manifold. In our method, uncerta ..."
Cited by 107 (10 self)
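The classical minimum variance (Capon/MVDR) beamformer that this paper makes robust picks weights minimizing output power subject to unit gain in the look direction, w = R⁻¹a / (aᴴR⁻¹a). A minimal numpy sketch with a made-up 3-element array (illustrating the non-robust baseline, not the paper's method):

```python
import numpy as np

def mvdr_weights(R, a):
    """Minimum-variance (Capon/MVDR) beamformer: minimize w^H R w
    subject to w^H a = 1, giving w = R^{-1} a / (a^H R^{-1} a)."""
    Ri_a = np.linalg.solve(R, a)
    return Ri_a / (a.conj() @ Ri_a)

# Toy 3-element array: unit-power noise plus a strong interferer.
a_sig = np.ones(3, dtype=complex)                  # look-direction steering vector
a_int = np.exp(1j * np.pi * 0.5 * np.arange(3))    # interferer steering vector
R = np.eye(3) + 100 * np.outer(a_int, a_int.conj())
w = mvdr_weights(R, a_sig)
print(abs(w.conj() @ a_sig))  # unit gain in the look direction
print(abs(w.conj() @ a_int))  # interferer strongly suppressed
```

The fragility the paper addresses is visible here: the null is placed at the assumed steering vectors, so errors in a_sig or in R (the uncertainty sources named in the abstract) can cause the beamformer to suppress the desired signal instead.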
Bias, Variance, and Arcing Classifiers
, 1996
"... Recent work has shown that combining multiple versions of unstable classifiers such as trees or neural nets results in reduced test set error. To study this, the concepts of bias and variance of a classifier are defined. Unstable classifiers can have universally low bias. Their problem is high varia ..."
Cited by 105 (0 self)