Results 1 - 10 of 1,080
An empirical comparison of voting classification algorithms: Bagging, boosting, and variants.
- Machine Learning, 1999
"... Abstract. Methods for voting classification algorithms, such as Bagging and AdaBoost, have been shown to be very successful in improving the accuracy of certain classifiers for artificial and real-world datasets. We review these algorithms and describe a large empirical study comparing several vari ..."
Abstract - Cited by 707 (2 self)
the variance for Naive-Bayes, which was very stable. We observed that Arc-x4 behaves differently than AdaBoost if reweighting is used instead of resampling, indicating a fundamental difference. Voting variants, some of which are introduced in this paper, include: pruning versus no pruning, use of probabilistic
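The comparison this entry describes can be sketched in a few lines. Below is a minimal illustration (not the paper's original MLC++ experiment) using scikit-learn's Bagging and AdaBoost implementations over a decision-tree base learner on synthetic data; the dataset, hyperparameters, and the `estimator` parameter name (scikit-learn >= 1.2) are assumptions of this sketch.

```python
# Minimal sketch of a Bagging vs. AdaBoost comparison over a tree base learner.
# Dataset and hyperparameters are illustrative, not the paper's protocol.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
base = DecisionTreeClassifier(random_state=0)

for name, clf in [
    ("single tree", base),
    # `estimator=` follows scikit-learn >= 1.2 (older versions: base_estimator=)
    ("bagging", BaggingClassifier(estimator=base, n_estimators=50, random_state=0)),
    ("adaboost", AdaBoostClassifier(n_estimators=50, random_state=0)),
]:
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"{name:12s} accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")
```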
Multiresolution grayscale and rotation invariant texture classification with local binary patterns
- IEEE Transactions on Pattern Analysis and Machine Intelligence, 2002
"... This paper presents a theoretically very simple, yet efficient, multiresolution approach to gray-scale and rotation invariant texture classification based on local binary patterns and nonparametric discrimination of sample and prototype distributions. The method is based on recognizing that certain ..."
Abstract - Cited by 1299 (39 self)
that certain local binary patterns, termed "uniform," are fundamental properties of local image texture and their occurrence histogram is proven to be a very powerful texture feature. We derive a generalized gray-scale and rotation invariant operator presentation that allows for detecting the "
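As a rough illustration of the operator the abstract describes, the sketch below computes a multiresolution rotation-invariant uniform LBP histogram with scikit-image; the (P, R) pairs and the random stand-in image are illustrative assumptions, not the paper's experimental protocol.

```python
# Sketch of a multiresolution rotation-invariant uniform LBP descriptor.
import numpy as np
from skimage.feature import local_binary_pattern

rng = np.random.default_rng(0)
image = (rng.random((128, 128)) * 255).astype(np.uint8)  # stand-in texture patch

hists = []
for P, R in [(8, 1), (16, 2), (24, 3)]:  # three (neighbors, radius) resolutions
    codes = local_binary_pattern(image, P, R, method="uniform")
    # "uniform" maps each pixel to one of P + 2 rotation-invariant pattern bins
    hist, _ = np.histogram(codes, bins=P + 2, range=(0, P + 2), density=True)
    hists.append(hist)

descriptor = np.concatenate(hists)  # joint multiresolution texture feature
print(descriptor.shape)             # (8+2) + (16+2) + (24+2) = 54 bins
```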
The processing-speed theory of adult age differences in cognition
- Psychological Review, 1996
"... A theory is proposed to account for some of the age-related differences reported in measures of Type A or fluid cognition. The central hypothesis in the theory is that increased age in adulthood is associated with a decrease in the speed with which many processing operations can be executed and that ..."
Abstract - Cited by 416 (2 self)
the products of early processing may no longer be available when later processing is complete (simultaneity). Several types of evidence, such as the discovery of considerable shared age-related variance across various measures of speed and large attenuation of the age-related influences on cognitive measures
Proof of a Fundamental Result in Self-Similar Traffic Modeling
- Computer Communication Review, 1997
"... We state and prove the following key mathematical result in self-similar traffic modeling: the superposition of many ON/OFF sources (also known as packet trains) with strictly alternating ON- and OFF-periods and whose ON-periods or OFF-periods exhibit the Noah Effect (i.e., have high variability or ..."
Abstract - Cited by 290 (8 self)
or infinite variance) can produce aggregate network traffic that exhibits the Joseph Effect (i.e., is self-similar or long-range dependent). There is, moreover, a simple relation between the parameters describing the intensities of the Noah Effect (high variability) and the Joseph Effect (self
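The stated result lends itself to a quick numerical check. The sketch below (an illustrative simulation, not the paper's proof) superposes ON/OFF sources with Pareto-distributed period lengths of tail index alpha = 1.5 and estimates the Hurst parameter of the aggregate by the aggregated-variance method; all parameter choices are assumptions.

```python
# Superposing heavy-tailed ON/OFF sources (the "Noah Effect") and checking
# for long-range dependence in the aggregate (the "Joseph Effect").
import numpy as np

rng = np.random.default_rng(0)
n_sources, horizon, alpha = 200, 20000, 1.5  # 1 < alpha < 2: infinite variance

def pareto_length():
    # Classic Pareto duration with tail index alpha (x_min = 1)
    return int(np.ceil(rng.pareto(alpha) + 1))

traffic = np.zeros(horizon)
for _ in range(n_sources):
    t, on = 0, rng.random() < 0.5
    while t < horizon:
        length = pareto_length()
        if on:
            traffic[t:t + length] += 1.0  # source emits at unit rate while ON
        t += length
        on = not on

# Aggregated-variance estimate of H: Var(block means of size m) ~ m^(2H - 2)
ms = np.unique(np.logspace(1, 3, 10).astype(int))
vars_m = [np.var(traffic[: horizon // m * m].reshape(-1, m).mean(axis=1)) for m in ms]
slope = np.polyfit(np.log(ms), np.log(vars_m), 1)[0]
print("estimated H ~", 1 + slope / 2)  # theory predicts H = (3 - alpha) / 2 = 0.75
```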
Near-optimal sensor placements in gaussian processes
- In ICML, 2005
"... When monitoring spatial phenomena, which can often be modeled as Gaussian processes (GPs), choosing sensor locations is a fundamental task. There are several common strategies to address this task, for example, geometry or disk models, placing sensors at the points of highest entropy (variance) in t ..."
Abstract - Cited by 342 (34 self)
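A compact sketch of the greedy mutual-information placement rule that this line of work proposes is given below; the RBF kernel, candidate grid, noise variance, and number of sensors are illustrative assumptions rather than the paper's setup.

```python
# Greedy mutual-information sensor placement in a GP: at each step pick the
# candidate y maximizing the conditional-variance ratio
#   Var(y | placed) / Var(y | all other unplaced candidates),
# which is monotone in the MI gain H(y|A) - H(y|complement).
import numpy as np

def rbf(a, b, ell=0.3):
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ell**2)

def cond_var(K, i, S):
    # GP posterior variance: Var(y_i | y_S) = K_ii - K_iS K_SS^{-1} K_Si
    if len(S) == 0:
        return K[i, i]
    KSS = K[np.ix_(S, S)]
    kiS = K[i, S]
    return K[i, i] - kiS @ np.linalg.solve(KSS, kiS)

grid = np.stack(np.meshgrid(np.linspace(0, 1, 10), np.linspace(0, 1, 10)), -1).reshape(-1, 2)
K = rbf(grid, grid) + 0.01 * np.eye(len(grid))  # kernel + assumed sensor noise
n, k, placed = len(grid), 8, []
for _ in range(k):
    best, best_gain = None, -np.inf
    for y in range(n):
        if y in placed:
            continue
        rest = [j for j in range(n) if j != y and j not in placed]
        gain = cond_var(K, y, placed) / cond_var(K, y, rest)
        if gain > best_gain:
            best, best_gain = y, gain
    placed.append(best)
print("chosen sensor locations:\n", grid[placed])
```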
The Sources of Variance in the Helsinki Stock Exchange: An Investigation of the Fundamental and the Transitory Variance
"... We investigate the relationship between the fundamental and the transitory variance. We study how the transitory variance deviates from the fundamental variance. We use the Gonzalo and Granger (1995) permanent-temporary approach to decompose the variance common factor into a transitory and a permane ..."
Abstract
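For illustration, the Gonzalo and Granger (1995) decomposition the abstract invokes can be sketched from a fitted VECM, where the permanent (fundamental) component is driven by the common factor f_t = alpha_perp' y_t; the simulated bivariate series, lag order, and use of statsmodels below are assumptions of this sketch, not the paper's data or code.

```python
# Gonzalo-Granger permanent-transitory decomposition from a fitted VECM.
import numpy as np
from scipy.linalg import null_space
from statsmodels.tsa.vector_ar.vecm import VECM

rng = np.random.default_rng(0)
T = 500
trend = np.cumsum(rng.normal(size=T))             # shared stochastic trend
y = np.column_stack([trend + rng.normal(size=T),  # two cointegrated series
                     trend + rng.normal(size=T)])

res = VECM(y, k_ar_diff=1, coint_rank=1).fit()
alpha_perp = null_space(res.alpha.T)   # orthogonal complement of the loadings
common_factor = y @ alpha_perp         # permanent (fundamental) component f_t
transitory = y @ res.beta              # stationary (transitory) combination
print(common_factor[:5].ravel(), transitory[:5].ravel())
```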
Simultaneous conjoint measurement: a new type of fundamental measurement
- Journal of Mathematical Psychology, 1964
"... The essential character of what is classically considered, e.g., by N. R. Campbell, the fundamental measurement of extensive quantities is described by an axiomatization for the comparison of effects of (or responses to) arbitrary combinations of “quantities” of a single specified kind. For example, ..."
Abstract - Cited by 168 (9 self)
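For context, the additive representation that simultaneous conjoint measurement delivers can be stated compactly; the formulation below is a standard statement of the result, not text quoted from the paper.

```latex
% Under the ordering and cancellation axioms, an order \succsim on A \times P
% admits real-valued scales \phi_1 on A and \phi_2 on P such that
(a, p) \succsim (b, q)
  \quad\Longleftrightarrow\quad
\phi_1(a) + \phi_2(p) \;\ge\; \phi_1(b) + \phi_2(q),
% and these scales are interval scales: unique up to
% \phi_i' = r\,\phi_i + s_i with r > 0.
```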
What makes investors trade?
- Journal of Finance, 2001
"... ABSTRACT What Makes Investors Trade? A unique data set allows us to monitor the buys, sells, and holds of individuals and institutions in the Finnish stock market on a daily basis. With this data set, we employ Logit regressions to identify the determinants of buying and selling activity over a two ..."
Abstract - Cited by 204 (13 self)
be equally useful to learn if more rational motivations, such as portfolio rebalancing consistent with mean-variance theory, tax loss trading, and life cycle considerations are the fundamental drivers of trade. Up until now, the empirical analysis of what makes investors trade has been hindered by limited
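The Logit setup the abstract mentions can be sketched as follows; the covariates (a past-return proxy and a tax-loss indicator) and the simulated data are illustrative stand-ins, not a reproduction of the paper's Finnish dataset.

```python
# Sketch of a Logit regression for the buy-vs-sell decision.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 2000
past_return = rng.normal(size=n)                     # momentum / reference-price proxy
capital_loss = (rng.random(n) < 0.3).astype(float)   # tax-loss-selling proxy
X = sm.add_constant(np.column_stack([past_return, capital_loss]))

# Simulate decisions from an assumed true model, then recover it by Logit
logit_p = -0.2 + 0.8 * past_return - 0.5 * capital_loss
buy = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

model = sm.Logit(buy, X).fit(disp=0)  # P(buy) as a function of the covariates
print(model.summary())
```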
The Analysis of Variance Error Part II: Fundamental Principles
- IEEE Transactions on Automatic Control, 2001
"... This paper presents the theoretical underpinnings for a preceding companion work in which new improved accuracy quantifications for noise induced estimation errors were presented. In particular, via the ideas of reproducing kernels and orthonormal parameterisations of the subspaces they represent, t ..."
Abstract - Cited by 1 (1 self)
, this paper develops new methods for evaluating certain quadratic forms in inverse Toeplitz matrices that are instrumental to the quantification of variance error. Additionally, new results on the convergence rates of generalised Fourier expansions are derived and then employed to derive necessary
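The central computational object named here, a quadratic form in an inverse Toeplitz matrix, can be evaluated without ever forming the inverse; the sketch below uses SciPy's Levinson-recursion solver with an illustrative covariance sequence (an assumption, not an example taken from the paper).

```python
# Evaluating x' T^{-1} x for a symmetric Toeplitz T via Levinson recursion.
import numpy as np
from scipy.linalg import solve_toeplitz, toeplitz

n = 200
c = 0.9 ** np.arange(n)            # first column of T (AR(1)-type covariances)
x = np.random.default_rng(0).normal(size=n)

q_fast = x @ solve_toeplitz(c, x)            # O(n^2): solves T z = x directly
q_ref = x @ np.linalg.solve(toeplitz(c), x)  # dense O(n^3) reference check
print(q_fast, q_ref)                         # agree to numerical precision
```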
Coil sensitivity encoding for fast MRI
- In Proceedings of the ISMRM 6th Annual Meeting, 1998
"... New theoretical and practical concepts are presented for considerably enhancing the performance of magnetic resonance imaging (MRI) by means of arrays of multiple receiver coils. Sensitivity encoding (SENSE) is based on the fact that receiver sensitivity generally has an encoding effect complementa ..."
Abstract - Cited by 193 (3 self)
or collimation but by spectral analysis. The idea of Lauterbur (1) to encode object contrast in the resonance spectrum by a magnetic field gradient forms the exclusive basis of signal localization in Fourier imaging. However powerful, the gradient-encoding concept implies a fundamental restriction. Only one
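The unfolding step at the heart of SENSE can be sketched directly: with reduction factor R, each aliased pixel is a sensitivity-weighted sum of R true pixels and is recovered by least squares. The coil count, random sensitivities, and identity noise covariance below are illustrative assumptions.

```python
# SENSE unfolding for one aliased pixel: solve a = S rho in the least-squares
# sense, where S stacks the complex coil sensitivities of the R superimposed
# pixel locations.
import numpy as np

rng = np.random.default_rng(0)
n_coils, R = 8, 2                                 # 8-coil array, 2x undersampling
S = rng.normal(size=(n_coils, R)) + 1j * rng.normal(size=(n_coils, R))
rho_true = np.array([1.0 + 0.5j, 0.3 - 0.2j])     # the R superimposed pixels

a = S @ rho_true + 0.01 * rng.normal(size=n_coils)  # aliased coil measurements
# With identity noise covariance: rho = (S^H S)^{-1} S^H a
rho_hat, *_ = np.linalg.lstsq(S, a, rcond=None)
print(rho_true, rho_hat)
```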