Results 1-10 of 882,318

Adaptive Regularization of Weight Vectors
Advances in Neural Information Processing Systems 22, 2009
"... We present AROW, a new online learning algorithm that combines several useful properties: large margin training, confidence weighting, and the capacity to handle non-separable data. AROW performs adaptive regularization of the prediction function upon seeing each new instance, allowing it to perform ..."
Cited by 65 (14 self)
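The adaptive-regularization step the abstract describes can be sketched as follows. This follows the published AROW update, which keeps a mean weight vector `mu` and a covariance `Sigma` tracking per-direction confidence; the variable names and the regularization constant `r` are our illustrative choices:

```python
import numpy as np

def arow_update(mu, Sigma, x, y, r=1.0):
    """One AROW step on example (x, y): adapt the mean weight vector
    and shrink the confidence (covariance) along the seen direction."""
    margin = y * (mu @ x)          # signed margin under the current mean
    v = x @ Sigma @ x              # variance (uncertainty) along x
    if margin < 1.0:               # hinge loss is positive: update
        beta = 1.0 / (v + r)
        alpha = (1.0 - margin) * beta
        mu = mu + alpha * y * (Sigma @ x)
        Sigma = Sigma - beta * np.outer(Sigma @ x, x @ Sigma)
    return mu, Sigma
```

After an update on a misclassified instance the margin on that instance strictly increases, while `Sigma` stays positive definite, so confidence only grows.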
Optimal Weight Vectors for Broadcast Channels
In Proc. IEEE Asilomar Conf. on Signals, Systems & Computers, 1996
"... In a multi-transmitter broadcast system, the weight vector for each message signal can provide an additional degree of freedom for signal enhancement and interference suppression by taking advantage of the spatial diversity among the users. To date, the design of the optimal weight vectors which max ..."
Cited by 2 (2 self)
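The snippet is truncated before stating the optimization criterion. One common formulation of "optimal weight vectors" for joint signal enhancement and interference suppression maximizes the signal-to-interference-plus-noise ratio, which reduces to a generalized eigenproblem; a minimal sketch under that assumption, with signal and interference-plus-noise covariances `Rs` and `Rn`:

```python
import numpy as np

def max_sinr_weight(Rs, Rn):
    """Weight vector maximizing the generalized Rayleigh quotient
    (w^T Rs w) / (w^T Rn w): the principal eigenvector of Rn^{-1} Rs."""
    vals, vecs = np.linalg.eig(np.linalg.solve(Rn, Rs))
    w = vecs[:, np.argmax(vals.real)].real
    return w / np.linalg.norm(w)
```

The returned vector attains the largest achievable quotient, so no other weight vector yields a higher SINR under these covariances.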
Locally weighted learning
Artificial Intelligence Review, 1997
"... This paper surveys locally weighted learning, a form of lazy learning and memory-based learning, and focuses on locally weighted linear regression. The survey discusses distance functions, smoothing parameters, weighting functions, local model structures, regularization of the estimates and bias, ass ..."
Cited by 594 (53 self)
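The locally weighted linear regression the survey focuses on can be sketched with a Gaussian distance-based weighting function (one of the many choices of distance function, smoothing parameter, and weighting function the survey compares):

```python
import numpy as np

def lwlr_predict(X, y, xq, bandwidth=1.0):
    """Locally weighted linear regression: fit a weighted least-squares
    affine model around the query point xq and evaluate it there."""
    d = np.linalg.norm(X - xq, axis=1)
    w = np.exp(-d**2 / (2 * bandwidth**2))      # Gaussian kernel weights
    A = np.hstack([np.ones((len(X), 1)), X])    # affine design matrix
    sw = np.sqrt(w)[:, None]                    # fold weights into lstsq
    beta, *_ = np.linalg.lstsq(sw * A, sw.ravel() * y, rcond=None)
    return np.concatenate([[1.0], xq]) @ beta
```

Because the model is refit per query, no global parametric form is ever committed to; training data is simply stored, hence "lazy" and "memory-based".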
Sparse Bayesian Learning and the Relevance Vector Machine
2001
"... This paper introduces a general Bayesian framework for obtaining sparse solutions to regression and classification tasks utilising models linear in the parameters. Although this framework is fully general, we illustrate our approach with a particular specialisation that we denote the `relevance vector machine' (RVM), a model of identical functional form to the popular and state-of-the-art `support vector machine' (SVM). We demonstrate that by exploiting a probabilistic Bayesian learning framework, we can derive accurate prediction models which typically utilise dramatically fewer ..."
Cited by 958 (5 self)
Training Support Vector Machines: an Application to Face Detection
1997
"... We investigate the application of Support Vector Machines (SVMs) in computer vision. SVM is a learning technique developed by V. Vapnik and his team (AT&T Bell Labs.) that can be seen as a new method for training polynomial, neural network, or Radial Basis Functions classifiers. The decision sur ..."
Cited by 728 (1 self)
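The paper trains SVMs by solving a quadratic program. As a compact stand-in that optimizes the same regularized hinge-loss objective, here is a Pegasos-style sub-gradient sketch; this is not the paper's training procedure, only an illustration of linear large-margin training:

```python
import numpy as np

def pegasos_train(X, y, lam=0.01, epochs=200, seed=0):
    """Primal sub-gradient SVM training (Pegasos-style): minimize
    lam/2 * ||w||^2 + mean hinge loss over labeled examples."""
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])
    t = 0
    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            t += 1
            eta = 1.0 / (lam * t)          # decaying step size
            w = (1 - eta * lam) * w        # shrink (regularization term)
            if y[i] * (w @ X[i]) < 1:      # margin violated: hinge gradient
                w = w + eta * y[i] * X[i]
    return w
```

Appending a constant 1 feature to each example gives the classifier a bias term without changing the update.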
Selection Weighted Vector Directional Filters
"... In this paper, a class of Weighted Vector Directional Filters (WVDFs) based on the selection of the output sample from the multichannel input set is analyzed and optimized. The WVDF output minimizes the sum of weighted angular distances to other input samples from the filtering window. Dependent o ..."
Cited by 11 (3 self)
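The selection rule the abstract describes (the output is the window sample minimizing the weighted sum of angular distances to the other samples) can be sketched as:

```python
import numpy as np

def wvdf(samples, weights):
    """Selection WVDF: return the input sample whose weighted sum of
    angular distances to all samples in the window is smallest."""
    S = np.asarray(samples, dtype=float)
    U = S / np.linalg.norm(S, axis=1, keepdims=True)   # unit directions
    cos = np.clip(U @ U.T, -1.0, 1.0)                  # pairwise cosines
    cost = (np.arccos(cos) * np.asarray(weights)).sum(axis=1)
    return S[np.argmin(cost)]
```

Because the output is always one of the inputs, the filter never introduces color vectors absent from the window, which is the point of selection-type multichannel filters.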
Term-weighting approaches in automatic text retrieval
Information Processing and Management, 1988
"... The experimental evidence accumulated over the past 20 years indicates that text indexing systems based on the assignment of appropriately weighted single terms produce retrieval results that are superior to those obtainable with other more elaborate text representations. These results depend crucia ..."
Cited by 2159 (10 self)
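Single-term weighting of the kind the paper evaluates is exemplified by the classic tf-idf scheme; a minimal sketch over tokenized documents (the paper compares many weighting variants, not only this one):

```python
import math

def tfidf(term, doc, corpus):
    """Classic single-term weight: term frequency in the document
    scaled by inverse document frequency across the corpus."""
    tf = doc.count(term)
    df = sum(1 for d in corpus if term in d)   # documents containing term
    return tf * math.log(len(corpus) / df) if df else 0.0
```

A term occurring in every document gets weight zero, while a term concentrated in few documents is weighted up, which is the discriminating behavior term weighting aims for.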
Weight Vectors of the Basic ... Module and the Littlewood-Richardson Rule
1995
"... The basic representation of A_1 is studied. The weight vectors are represented in terms of Schur functions. A suitable basis of any weight space is given. The Littlewood-Richardson rule appears in the linear relations among weight vectors. ..."
Computing semantic relatedness using Wikipedia-based explicit semantic analysis
In Proceedings of the 20th International Joint Conference on Artificial Intelligence, 2007
"... Computing semantic relatedness of natural language texts requires access to vast amounts of common-sense and domain-specific world knowledge. We propose Explicit Semantic Analysis (ESA), a novel method that represents the meaning of texts in a high-dimensional space of concepts derived from Wikipedia. We use machine learning techniques to explicitly represent the meaning of any text as a weighted vector of Wikipedia-based concepts. Assessing the relatedness of texts in this space amounts to comparing the corresponding vectors using conventional metrics (e.g., cosine). Compared ..."
Cited by 546 (9 self)
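The comparison step ESA performs, cosine similarity between weighted concept vectors, can be sketched as follows. The concept weights below are hand-made and hypothetical, standing in for the Wikipedia-derived vectors the paper builds:

```python
import numpy as np

# Hypothetical word-to-concept weights; in ESA these come from the
# strength of association between words and Wikipedia articles.
CONCEPTS = {
    "piano":  np.array([0.9, 0.1, 0.0]),
    "violin": np.array([0.8, 0.2, 0.0]),
    "stock":  np.array([0.0, 0.1, 0.9]),
}

def esa_vector(text):
    """Represent a text as the sum of its words' concept vectors."""
    return sum(CONCEPTS[w] for w in text.split() if w in CONCEPTS)

def relatedness(a, b):
    """Cosine similarity between the two texts' concept vectors."""
    u, v = esa_vector(a), esa_vector(b)
    return (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))
```

Texts sharing dominant concepts score near 1; texts about unrelated concepts score near 0.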
Opportunistic beamforming based on multiple weighting vectors
IEEE Trans. Wireless Commun., 2005
"... In order to improve the throughput of opportunistic beamforming, the authors generalize opportunistic beamforming by using multiple random weighting vectors at each time slot. The base station chooses the best weighting vector and performs opportunistic beamforming with this op ..."
Cited by 16 (0 self)
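The per-slot selection the abstract describes, choosing the best of several random weighting vectors, can be sketched as follows; the complex Gaussian candidates and the effective channel gain |h^H w| as the selection criterion are our illustrative assumptions:

```python
import numpy as np

def pick_beam(h, num_vectors=8, seed=0):
    """Draw several random unit weighting vectors and keep the one
    giving the largest effective channel gain |h^H w| for channel h."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(num_vectors, len(h))) \
        + 1j * rng.normal(size=(num_vectors, len(h)))
    W /= np.linalg.norm(W, axis=1, keepdims=True)  # unit-norm candidates
    gains = np.abs(W.conj() @ h)                   # |w^H h| per candidate
    best = int(np.argmax(gains))
    return W[best], float(gains[best])
```

With more candidates per slot, the chosen vector is more likely to be nearly aligned with some user's channel, which is where the throughput gain over single-vector opportunistic beamforming comes from.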