Results 1 – 10 of 64
Experiments with Reinforcement Learning in Problems with Continuous State and Action Spaces
, 1996
"... A key element in the solution of reinforcement learning problems is the value function. The purpose of this function is to measure the long-term utility or value of any given state and it is important because an agent can use it to decide what to do next. A common problem in reinforcement learning w ..."
Abstract

Cited by 115 (6 self)
, state and action spaces. In particular, we discuss the benefits of using sparse coarse-coded funct...
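As a concrete illustration of the value-function idea described in this abstract, the sketch below runs tabular TD(0) on a small random-walk problem. This is a hypothetical toy example for illustration, not code from the paper; the problem setup (5 interior states, reward 1 at the right terminal) and all parameters are assumptions.

```python
import numpy as np

# Toy sketch (not the paper's algorithm): tabular TD(0) estimating the
# value function V(s) of a 5-state random walk. States 0 and 6 are
# terminal; stepping into state 6 yields reward 1, everything else 0.
rng = np.random.default_rng(1)
n, gamma, alpha = 5, 1.0, 0.1
V = np.zeros(n + 2)                    # one entry per state, terminals included
for _ in range(2000):                  # episodes
    s = (n + 1) // 2                   # start in the middle state
    while s not in (0, n + 1):
        s2 = s + rng.choice((-1, 1))   # random step left or right
        r = 1.0 if s2 == n + 1 else 0.0
        V[s] += alpha * (r + gamma * V[s2] - V[s])  # TD(0) update
        s = s2
# true values for the interior states are 1/6, 2/6, ..., 5/6
print(np.round(V[1:-1], 2))
```

The learned values increase from left to right, so an agent comparing V across neighboring states can decide which way to move, which is exactly the use of the value function the abstract describes.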
Generalization in Reinforcement Learning: Successful Examples Using Sparse Coarse Coding
 Advances in Neural Information Processing Systems 8
, 1996
"... On large problems, reinforcement learning systems must use parameterized function approximators such as neural networks in order to generalize between similar situations and actions. In these cases there are no strong theoretical results on the accuracy of convergence, and computational results have ..."
Abstract

Cited by 433 (20 self)
the control tasks they attempted, and for one that is significantly larger. The most important differences are that we used sparse coarse-coded function approximators (CMACs) whereas they used mostly global function approximators, and that we learned online whereas they learned offline. Boyan and Moore
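CMAC-style sparse coarse coding can be sketched as several overlapping tilings of the state space, each offset from the others, with exactly one active tile per tiling. The one-dimensional version below is a minimal illustration; the tiling counts and ranges are arbitrary choices, not the paper's settings.

```python
import numpy as np

def tile_features(x, n_tilings=4, n_tiles=8, lo=0.0, hi=1.0):
    """Sparse binary feature vector for scalar x in [lo, hi]:
    one active tile per tiling, with tilings offset from each other."""
    feats = np.zeros(n_tilings * n_tiles)
    width = (hi - lo) / n_tiles
    for t in range(n_tilings):
        offset = t * width / n_tilings          # stagger each tiling
        idx = int((x - lo + offset) / width)
        idx = min(idx, n_tiles - 1)             # clamp at the upper edge
        feats[t * n_tiles + idx] = 1.0
    return feats

phi = tile_features(0.5)
# Exactly one tile fires per tiling, so the code is sparse: 4 of 32 entries.
print(int(phi.sum()), phi.size)
```

A linear approximator then estimates values as `w @ phi`, and because only a few features are active, each update generalizes to nearby states (which share tiles) but not to distant ones.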
On sparse sets with the Green function of the highest smoothness
 Comput. Methods Funct. Theory
"... Abstract. Let E be a regular compact subset of the real line, let gC\E(z,∞) be the Green function of the complement of E with respect to the extended complex plane C with pole at ∞. We construct two examples of sets E of the minimum Hausdorff dimension with gC\E satisfying the Hölder condition with ..."
Abstract

Cited by 1 (1 self)
Abstract. Let E be a regular compact subset of the real line, let gC\E(z,∞) be the Green function of the complement of E with respect to the extended complex plane C with pole at ∞. We construct two examples of sets E of the minimum Hausdorff dimension with gC\E satisfying the Hölder condition with p = 1/2 either uniformly or locally.
Sparse Block–Jacobi Matrices with Exact Hausdorff Dimension
, 908
"... We show that the Hausdorff dimension of the spectral measure of a class of deterministic, i.e. nonrandom, block–Jacobi matrices may be determined exactly, improving a result of Zlatoš (J. Funct. Anal. 207, 216–252 (2004)). ..."
Abstract

Cited by 2 (2 self)
We show that the Hausdorff dimension of the spectral measure of a class of deterministic, i.e. nonrandom, block–Jacobi matrices may be determined exactly, improving a result of Zlatoš (J. Funct. Anal. 207, 216–252 (2004)).
Sparse Matrix Decompositions and Graph Characterizations
"... The question of when zeros (i.e., sparsity) in a positive definite matrix A are preserved in its Cholesky decomposition, and vice versa, was addressed by Paulsen et al. [19] [see Journ. of Funct. Anal., 85, 151–178]. In particular, they prove that for the pattern of zeros in A to be retained in the ..."
Abstract

Cited by 1 (1 self)
The question of when zeros (i.e., sparsity) in a positive definite matrix A are preserved in its Cholesky decomposition, and vice versa, was addressed by Paulsen et al. [19] [see Journ. of Funct. Anal., 85, 151–178]. In particular, they prove that for the pattern of zeros in A to be retained
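The zeros-preserved phenomenon is easy to observe numerically. In the sketch below (an illustration of the general fact, not code from the paper) a tridiagonal positive definite matrix, whose sparsity graph is a path and hence chordal, factors with no fill-in: every zero of A below the diagonal is also zero in its Cholesky factor L.

```python
import numpy as np

n = 5
# Tridiagonal SPD matrix: diagonally dominant, so positive definite.
A = 2.0 * np.eye(n) - 0.5 * np.eye(n, k=1) - 0.5 * np.eye(n, k=-1)
L = np.linalg.cholesky(A)              # lower-triangular factor, A = L L^T

# Check that the factor introduced no fill-in below the diagonal.
below = np.tril(np.ones((n, n), dtype=bool), k=-1)
fill_in = np.abs(L[(A == 0) & below]).max()
print(fill_in)                         # 0.0: zeros of A survive in L
```

Changing the zero pattern to a non-chordal one (e.g. a cycle) generally does produce fill-in, which is the kind of graph characterization the abstract refers to.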
Hierarchical Sparse Coded Surface Models
"... Abstract. In this paper, we describe a novel approach to construct textured 3D environment models in a hierarchical fashion based on local surface patches. Compared to previous approaches, the hierarchy enables our method to represent the environment with differently sized surface patches. The recon ..."
Abstract
with large variations at high resolution. In addition, we compactly describe local surface attributes via sparse coding based on an overcomplete dictionary. In this way, we additionally exploit similarities in structure and texture, which leads to compact models. We learn the dictionary directly from
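Sparse coding against an overcomplete dictionary, as mentioned in this snippet, can be sketched with a simple greedy matching-pursuit loop. This is an illustration of the general technique only; the paper's own dictionary-learning method is not shown, and the dictionary here is random rather than learned.

```python
import numpy as np

def matching_pursuit(x, D, k=3):
    """Greedy sparse code: explain x with at most k atoms of the
    dictionary D (unit-norm columns), one residual step at a time."""
    r = x.astype(float).copy()
    code = np.zeros(D.shape[1])
    for _ in range(k):
        j = np.argmax(np.abs(D.T @ r))   # most correlated atom
        c = D[:, j] @ r                  # its coefficient on the residual
        code[j] += c
        r -= c * D[:, j]                 # peel that atom off
    return code

rng = np.random.default_rng(0)
D = rng.normal(size=(16, 64))            # overcomplete: 64 atoms in R^16
D /= np.linalg.norm(D, axis=0)           # unit-norm atoms
x = 2.0 * D[:, 5] - 1.0 * D[:, 40]       # signal built from two atoms
code = matching_pursuit(x, D, k=5)
print(np.count_nonzero(code))            # at most 5 nonzero coefficients
```

The resulting code is sparse by construction, which is what makes such representations compact: most signals are described by a handful of dictionary atoms.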
Collaborative Hierarchical Sparse Modeling
, 2010
Abstract
Improving The Numerical Stability And The Performance Of A Parallel Sparse Solver
 Computers Math. Applic
"... Coarse grain parallel codes for solving sparse systems of linear algebraic equations can be developed in several different ways. The following procedure is suitable for some parallel computers. A preliminary reordering of the matrix is first applied to move as many zero elements as possible to the l ..."
Abstract

Cited by 3 (1 self)
Coarse grain parallel codes for solving sparse systems of linear algebraic equations can be developed in several different ways. The following procedure is suitable for some parallel computers. A preliminary reordering of the matrix is first applied to move as many zero elements as possible
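The effect of such a preliminary reordering is easy to see on a toy case. The sketch below is a hypothetical illustration of the general idea, not the paper's solver: an arrowhead matrix factored as-is fills in completely, while permuting its dense row/column to the end preserves all the zeros.

```python
import numpy as np

n = 6
# Arrowhead SPD matrix: dense first row/column, diagonal elsewhere.
A = 4.0 * np.eye(n)
A[0, :] = 1.0
A[:, 0] = 1.0
A[0, 0] = float(n)                     # diagonally dominant, hence SPD

L_bad = np.linalg.cholesky(A)          # dense spike first: total fill-in
perm = np.r_[1:n, 0]                   # reorder: move the spike last
Ap = A[np.ix_(perm, perm)]
L_good = np.linalg.cholesky(Ap)        # factor keeps the zero pattern

print(np.count_nonzero(L_bad), np.count_nonzero(L_good))  # 21 vs 11
```

Real reordering heuristics (minimum degree, nested dissection, Cuthill–McKee) automate this choice for large irregular matrices, and a good ordering also shapes the coarse-grained parallel tasks the abstract discusses.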
Higher-order structure of natural images
"... We present a statistical model for learning efficient codes of higher-order structure in natural images. The model, a nonlinear generalization of independent component analysis, replaces the standard assumption of independence for the joint distribution of coefficients with a distribution that is a ..."
Abstract
that is adapted to the variance structure of the coefficients of an efficient image basis. This offers a novel description of higher-order image structure and provides a way to learn coarse-coded, sparse-distributed representations of abstract image properties such as object location, scale, and texture.
Compile time transformations for sparse matrix computation used in PERMAS to improve locality
, 1998
"... Automatic scheduling in parallel/distributed systems for coarse-grained irregular problems such as sparse matrix factorization is challenging since it requires efficient runtime support. In the literature there are important contributions about parallelization for this kind of proble ..."
Abstract
Automatic scheduling in parallel/distributed systems for coarse-grained irregular problems such as sparse matrix factorization is challenging since it requires efficient runtime support. In the literature there are important contributions about parallelization for this kind