Results 1–10 of 1,897
Operator-valued Kernels for Learning from Functional Response Data
, 2016
"... Abstract In this paper we consider the problems of supervised classification and regression in the case where attributes and labels are functions: each data point is represented by a set of functions, and the label is also a function. We focus on the use of reproducing kernel Hilbert space theory to learn from such functional data. Basic concepts and properties of kernel-based learning are extended to include the estimation of function-valued functions. In this setting, the representer theorem is restated, a set of rigorously defined infinite-dimensional operator-valued kernels that can be valuably ..."
Calibrating noise to sensitivity in private data analysis
 In Proceedings of the 3rd Theory of Cryptography Conference
, 2006
"... Abstract. We continue a line of research initiated in [10, 11] on privacy-preserving statistical databases. Consider a trusted server that holds a database of sensitive information. Given a query function f mapping databases to reals, the so-called true answer is the result of applying f to the database ... obtain separation results showing the increased value of interactive sanitization mechanisms over non-interactive ones. 1 Introduction We continue a line of research initiated in [10, 11] on privacy in statistical databases. A statistic is a quantity computed from a sample. Intuitively, if the database is a ..."
Cited by 649 (60 self)
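The paper above calibrates additive noise to a query's global sensitivity. A minimal sketch of that idea in Python (the Laplace mechanism for a real-valued query; function and variable names are illustrative, not from the paper):

```python
import numpy as np

def laplace_mechanism(true_answer, sensitivity, epsilon, rng=None):
    """Return a differentially private answer by adding Laplace noise.

    The noise scale is calibrated to the query's global sensitivity,
    scale = sensitivity / epsilon, which yields epsilon-differential
    privacy for the query.
    """
    rng = np.random.default_rng() if rng is None else rng
    scale = sensitivity / epsilon
    return true_answer + rng.laplace(loc=0.0, scale=scale)

# Example: a counting query ("how many rows satisfy a predicate?") has
# sensitivity 1, since adding or removing one row changes it by at most 1.
database = np.array([1, 0, 1, 1, 0, 1])
true_count = int(database.sum())
private_count = laplace_mechanism(true_count, sensitivity=1.0, epsilon=0.5)
```

Smaller epsilon means stronger privacy and proportionally more noise, since the scale grows as 1/epsilon.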
Max-margin Markov networks
, 2003
"... In typical classification tasks, we seek a function which assigns a label to a single object. Kernel-based approaches, such as support vector machines (SVMs), which maximize the margin of confidence of the classifier, are the method of choice for many such tasks. Their popularity stems both from the ..."
Cited by 604 (15 self)
On kernel target alignment
 Advances in Neural Information Processing Systems 14
, 2002
"... Kernel-based methods are increasingly being used for data modeling because of their conceptual simplicity and outstanding performance on many tasks. However, the kernel function is often chosen using trial-and-error heuristics. In this paper we address the problem of measuring the degree of agreement ..."
Cited by 298 (8 self)
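The alignment measure this paper studies has a simple closed form: A(K1, K2) = ⟨K1, K2⟩_F / (‖K1‖_F ‖K2‖_F), the cosine between Gram matrices under the Frobenius inner product. A small self-contained sketch (variable names are illustrative):

```python
import numpy as np

def alignment(K1, K2):
    """Empirical alignment between two kernel (Gram) matrices:
    A(K1, K2) = <K1, K2>_F / (||K1||_F * ||K2||_F).
    """
    inner = np.sum(K1 * K2)
    return inner / (np.linalg.norm(K1) * np.linalg.norm(K2))

# Alignment of a kernel with the "ideal" target kernel K* = y y^T
# built from labels y in {-1, +1}.
y = np.array([1.0, 1.0, -1.0, -1.0])
K_target = np.outer(y, y)
linear_K = np.eye(4)  # e.g. a (degenerate) kernel treating all points as orthogonal
score = alignment(linear_K, K_target)
```

Because the measure is a normalized inner product it is invariant to rescaling either kernel, which is why it can be used to compare kernels chosen by different heuristics.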
Consistency of the group lasso and multiple kernel learning
 Journal of Machine Learning Research
, 2007
"... We consider the least-squares regression problem with regularization by a block ℓ1-norm, i.e., a sum of Euclidean norms over spaces of dimensions larger than one. This problem, referred to as the group Lasso, extends the usual regularization by the ℓ1-norm where all spaces have dimension one, where it ... are replaced by functions and reproducing kernel Hilbert norms, the problem is usually referred to as multiple kernel learning and is commonly used for learning from heterogeneous data sources and for nonlinear variable selection. Using tools from functional analysis, and in particular covariance operators ..."
Cited by 274 (33 self)
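A standard computational building block for the group Lasso described above is the proximal operator of the block ℓ1-norm: each group's coefficient vector is shrunk jointly, and whole groups are zeroed out when their Euclidean norm falls below the threshold. A minimal sketch (illustrative, not the paper's code):

```python
import numpy as np

def group_soft_threshold(w, groups, threshold):
    """Proximal operator of the block l1-norm (a sum of Euclidean norms
    over groups of coefficients): scale each group toward zero by
    (1 - threshold / ||w_g||), setting it exactly to zero when
    ||w_g|| <= threshold. This is what produces group-level sparsity.
    """
    out = np.zeros_like(w)
    for idx in groups:
        g = w[idx]
        norm = np.linalg.norm(g)
        if norm > threshold:
            out[idx] = (1.0 - threshold / norm) * g
    return out

w = np.array([3.0, 4.0, 0.1, -0.1])
groups = [np.array([0, 1]), np.array([2, 3])]
shrunk = group_soft_threshold(w, groups, threshold=1.0)
# group (0,1) has norm 5 and survives shrunken; group (2,3) has norm ~0.14
# and is zeroed out entirely
```

Iterating this operator inside a proximal-gradient loop on the least-squares loss gives a basic group-Lasso solver.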
Online Learning with Multiple Operator-valued Kernels
"... We consider the problem of learning a vector-valued function f in an online learning setting. The function f is assumed to lie in a reproducing kernel Hilbert space of operator-valued kernels. We describe two online algorithms for learning f while taking into account the output structure. A first contri ..."
Cited by 1 (0 self)
Learning Multiple Tasks with Kernel Methods
 Journal of Machine Learning Research
, 2005
"... Editor: John Shawe-Taylor. We study the problem of learning many related tasks simultaneously using kernel methods and regularization. The standard single-task kernel methods, such as support vector machines and regularization networks, are extended to the case of multi-task learning. Our analysis shows that the problem of estimating many task functions with regularization can be cast as a single-task learning problem if a family of multi-task kernel functions we define is used. These kernels model relations among the tasks and are derived from a novel form of regularizers. Specific kernels ..."
Cited by 251 (10 self)
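A concrete (and hypothetical, not the paper's exact construction) instance of such a multi-task kernel family couples a scalar kernel on inputs with a task-similarity term, K((x, s), (x', t)) = k(x, x') · (δ_st + μ), where μ ≥ 0 controls how much the tasks share — μ = 0 treats tasks independently, large μ pools them into one task:

```python
import numpy as np

def multitask_gram(X, tasks, mu, gamma):
    """Gram matrix for a simple multi-task kernel
    K((x, s), (x', t)) = k(x, x') * (delta_st + mu),
    with k a Gaussian kernel. The Schur product of the two PSD factors
    (the scalar Gram matrix and I + mu * ones) is again PSD, so this is
    a valid kernel for any mu >= 0.
    """
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    k = np.exp(-gamma * sq)
    task_coupling = (tasks[:, None] == tasks[None, :]).astype(float) + mu
    return k * task_coupling

# Two tasks, two examples each; rows are (input, task) pairs.
X = np.array([[0.0], [1.0], [0.0], [1.0]])
tasks = np.array([0, 0, 1, 1])
K = multitask_gram(X, tasks, mu=0.5, gamma=1.0)
```

Plugging this Gram matrix into any single-task kernel method (SVM, kernel ridge) then trains all tasks jointly, which is the casting described in the abstract.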
Latent Semantic Models for Collaborative Filtering
 ACM Trans. Information Systems
"... Collaborative filtering aims at learning predictive models of user preferences, interests or behavior from community data, that is, a database of available user preferences. In this article, we describe a new family of model-based algorithms designed for this task. These algorithms rely on a statist ..."
Cited by 331 (1 self)
Fields of experts: A framework for learning image priors
 In CVPR
, 2005
"... We develop a framework for learning generic, expressive image priors that capture the statistics of natural scenes and can be used for a variety of machine vision tasks. The approach extends traditional Markov Random Field (MRF) models by learning potential functions over extended pixel neighborhoods. Field potentials are modeled using a Products-of-Experts framework that exploits nonlinear functions of many linear filter responses. In contrast to previous MRF approaches, all parameters, including the linear filters themselves, are learned from training data. We demonstrate the capabilities ..."
Cited by 292 (4 self)
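The Fields-of-Experts construction above defines an image prior through linear filter responses at every pixel neighborhood. A toy sketch of evaluating such a prior's energy (assuming Student-t experts of the form used in Products-of-Experts models; the filters and names here are illustrative, not the learned model from the paper):

```python
import numpy as np

def foe_energy(image, filters, alphas):
    """Energy of an image under a Fields-of-Experts-style prior:
        E(x) = sum_k alpha_k * sum_i log(1 + 0.5 * (J_k * x)_i^2),
    where (J_k * x)_i is the response of filter J_k at neighborhood i
    (valid cross-correlation, no padding). Lower energy means the image
    is more plausible under the prior.
    """
    H, W = image.shape
    energy = 0.0
    for J, alpha in zip(filters, alphas):
        h, w = J.shape
        resp = np.zeros((H - h + 1, W - w + 1))
        for i in range(resp.shape[0]):
            for j in range(resp.shape[1]):
                resp[i, j] = np.sum(image[i:i + h, j:j + w] * J)
        energy += alpha * np.sum(np.log1p(0.5 * resp ** 2))
    return energy

# A horizontal-gradient filter: constant images get zero energy,
# images with edges or noise get positive energy.
grad_filter = [np.array([[1.0, -1.0]])]
smooth = foe_energy(np.ones((4, 4)), grad_filter, [1.0])
noisy = foe_energy(np.array([[0.0, 1.0], [1.0, 0.0]]), grad_filter, [1.0])
```

In the paper's setting both the filters J_k and the weights alpha_k are learned from natural-image data; here they are fixed by hand only to show the energy computation.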
Functional Regularized Least Squares Classification with Operator-valued Kernels
"... Although operator-valued kernels have recently received increasing interest in various machine learning and functional data analysis problems such as multi-task learning or functional regression, little attention has been paid to the understanding of their associated feature spaces. In this paper, w ..."
Cited by 12 (6 self)