Results 1 – 10 of 5,756
Active Learning with Statistical Models
, 1995
Cited by 677 (12 self)
For many types of learners one can compute the statistically "optimal" way to select data. We review how these techniques have been used with feedforward neural networks [MacKay, 1992; Cohn, 1994]. We then show how the same principles may be used to select data for two alternative, statistically-based learning architectures: mixtures of Gaussians and locally weighted regression. While the techniques for neural networks are expensive and approximate, the techniques for mixtures of Gaussians and locally weighted regression are both efficient and accurate.
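As a rough illustration of the variance-based selection the abstract alludes to (this is a generic sketch for a linear least-squares learner, not the paper's algorithm; the function name and the regularization constant are invented here):

```python
import numpy as np

def pick_query(X_labeled, candidates):
    # For a linear least-squares learner, the predictive variance at a
    # point x is proportional to x^T (X^T X)^{-1} x.  A simple
    # variance-minimizing rule queries the candidate whose current
    # predictive variance is largest.
    d = X_labeled.shape[1]
    A = np.linalg.inv(X_labeled.T @ X_labeled + 1e-8 * np.eye(d))
    scores = np.einsum('ij,jk,ik->i', candidates, A, candidates)
    return int(np.argmax(scores))

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))      # points already labeled
cand = rng.normal(size=(50, 3))   # pool of unlabeled candidates
pick = pick_query(X, cand)        # index of the next point to label
```

The paper's contribution is that for mixtures of Gaussians and locally weighted regression the analogous variance computation is exact and cheap, unlike the neural-network case.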
Locally weighted learning
 ARTIFICIAL INTELLIGENCE REVIEW
, 1997
Cited by 594 (53 self)
This paper surveys locally weighted learning, a form of lazy learning and memory-based learning, and focuses on locally weighted linear regression. The survey discusses distance functions, smoothing parameters, weighting functions, local model structures, regularization of the estimates and bias, assessing predictions, handling noisy data and outliers, improving the quality of predictions by tuning fit parameters, interference between old and new data, implementing locally weighted learning efficiently, and applications of locally weighted learning. A companion paper surveys how locally weighted learning can be used in robot learning and control.
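The ingredients the survey enumerates (a distance function, a weighting function, a smoothing parameter, a local model structure) can be sketched as a minimal locally weighted linear regression, here assuming a Gaussian weighting function; `lwr_predict` and `tau` are illustrative names, not from the paper:

```python
import numpy as np

def lwr_predict(Xq, X, y, tau=0.2):
    # Gaussian weighting on squared Euclidean distance: training points
    # near each query dominate the locally fitted affine model; tau is
    # the smoothing (bandwidth) parameter.
    preds = []
    for xq in Xq:
        w = np.exp(-np.sum((X - xq) ** 2, axis=1) / (2 * tau ** 2))
        Xa = np.hstack([X, np.ones((len(X), 1))])   # affine local model
        W = np.diag(w)
        # Small ridge term regularizes the weighted normal equations.
        beta = np.linalg.solve(Xa.T @ W @ Xa + 1e-8 * np.eye(Xa.shape[1]),
                               Xa.T @ W @ y)
        preds.append(np.append(xq, 1.0) @ beta)
    return np.array(preds)

X = np.linspace(0, 6, 60).reshape(-1, 1)
y = np.sin(X).ravel()
yhat = lwr_predict(np.array([[1.5]]), X, y, tau=0.2)
```

A new local model is fit per query ("lazy" learning): nothing is precomputed at training time beyond storing the data.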
Regularization Theory and Neural Networks Architectures
 Neural Computation
, 1995
Cited by 396 (33 self)
We had previously shown that regularization principles lead to approximation schemes which are equivalent to networks with one layer of hidden units, called Regularization Networks. In particular, standard smoothness functionals lead to a subclass of regularization networks, the well known Radial Basis Functions approximation schemes. This paper shows that regularization networks encompass a much broader range of approximation schemes, including many of the popular general additive models and some of the neural networks. In particular, we introduce new classes of smoothness functionals that lead to different classes of basis functions. Additive splines as well as some tensor product splines can be obtained from appropriate classes of smoothness functionals. Furthermore, the same generalization that extends Radial Basis Functions (RBF) to Hyper Basis Functions (HBF) also leads from additive models to ridge approximation models, containing as special cases Breiman's hinge functions, som...
Graded Specht modules
Cited by 35 (8 self)
Recently, the first two authors have defined a Z-grading on group algebras of symmetric groups and more generally on the cyclotomic Hecke algebras of type G(l,1,d). In this paper we explain how to grade Specht modules over these algebras.
A SPECHT FILTRATION OF AN INDUCED SPECHT MODULE
Cited by 4 (0 self)
To John Cannon and Derek Holt on the occasions of their significant birthdays, in recognition of their distinguished contributions to mathematics. Let Hn be a (degenerate or non-degenerate) Hecke algebra of type G(ℓ,1,n), defined over a commutative ring R with one, and let S(µ) be a Specht module for Hn. This paper shows that the induced Specht module S(µ) ⊗Hn Hn+1 has an explicit Specht filtration.
Symmetric group modules with Specht and dual Specht filtrations
, 2006
Cited by 3 (1 self)
The author and Nakano recently proved that multiplicities in a Specht filtration of a symmetric group module are well-defined precisely when the characteristic is at least five. This result suggested the possibility of a symmetric group theory analogous to that of good filtrations and tilting modules ...
Reducible Specht modules
 J. Algebra
, 2004
Cited by 23 (9 self)
This is the author’s version of a work that was accepted for publication in the ...
THE VARIETIES FOR SOME SPECHT MODULES
, 2009
Cited by 5 (2 self)
J. Carlson introduced the cohomological and rank variety for a module over a finite group algebra. We give a general form for the largest component of the variety for the Specht module for the partition (p^p) of p², restricted to a maximal elementary abelian p-subgroup of rank p. We determine the va ...
Branching rules for Specht modules
 J. Algebra
Cited by 4 (1 self)
Let Σn be the symmetric group of degree n, and let F be a field of characteristic distinct from 2. Let S^λ_F be the Specht module over FΣn corresponding to the partition λ of n. We find the indecomposable components of the restricted module S^λ_F ↓ Σn−1 and the induced module S^λ_F ↑ Σn+1.
On the cohomology of Specht modules
, 2006
Cited by 2 (2 self)
We investigate the cohomology of the Specht module S^λ for the symmetric group Σd. We show that if 0 ≤ i ≤ p−2, then H^i(Σd, S^λ) is isomorphic to H^{s+i}(B, w0·λ′ − δ), where s = d(d−1)/2, B is the Borel subgroup of the algebraic group GLd(k), and δ = (1^d) is the weight of the determinant representation. We obtain ...