Results 1–10 of 1,189,121
Weighted Spaces
Abstract. We prove existence of a new type of positive solutions of the semilinear equation −Δu + u = u^p on R^n, where 1 < p < (n+2)/(n−2). These solutions are bounded, but do not tend to zero at infinity. Indeed, they decay to zero away from three half-lines with a common origin, and their asymptotic profile is periodic along these half-lines.
Explaining Away in Weight Space
, 2000
"... Explaining away has mostly been considered in terms of inference of states in belief networks. We show how it can also arise in a Bayesian context in inference about the weights governing relationships such as those between stimuli and reinforcers in conditioning experiments such as backward blo ..."
Cited by 31 (1 self)
blocking. We show how explaining away in weight space can be accounted for using an extension of a Kalman filter model; provide a new approximate way of looking at the Kalman gain matrix as a whitener for the correlation matrix of the observation process; suggest a network implementation
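The Kalman-filter account this abstract describes can be sketched in a few lines: after compound training of two stimuli, the weight covariance acquires a negative off-diagonal term, so later training of one stimulus alone pulls the other stimulus's weight down (backward blocking). This is only a minimal sketch of the idea; the parameters (Q, R, trial counts) are made-up illustrative values, not those of the paper.

```python
# Minimal sketch: Kalman filtering over weights produces "explaining away"
# (backward blocking). Hypothetical parameters, 2 stimuli.

def kalman_step(w, P, x, r, Q=0.01, R=0.1):
    """One Kalman update of weights w (mean) and P (covariance) for input x, reward r."""
    n = len(w)
    # Random drift of the weights adds Q to the diagonal of P.
    P = [[P[i][j] + (Q if i == j else 0.0) for j in range(n)] for i in range(n)]
    Px = [sum(P[i][j] * x[j] for j in range(n)) for i in range(n)]
    s = sum(x[i] * Px[i] for i in range(n)) + R          # innovation variance
    K = [Px[i] / s for i in range(n)]                    # Kalman gain
    err = r - sum(x[i] * w[i] for i in range(n))         # prediction error
    w = [w[i] + K[i] * err for i in range(n)]
    P = [[P[i][j] - K[i] * Px[j] for j in range(n)] for i in range(n)]
    return w, P

w, P = [0.0, 0.0], [[1.0, 0.0], [0.0, 1.0]]
for _ in range(20):                      # phase 1: compound AB -> reward
    w, P = kalman_step(w, P, [1.0, 1.0], 1.0)
w2_before = w[1]
for _ in range(20):                      # phase 2: A alone -> reward
    w, P = kalman_step(w, P, [1.0, 0.0], 1.0)
# Backward blocking: B's weight falls even though B was never re-presented,
# because phase 1 left a negative correlation between the two weights in P.
print(w2_before > w[1])  # True
```

The negative off-diagonal entry of P is what makes the gain for the absent stimulus negative, which is the "explaining away" effect in weight space.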
Decay rates of semilinear viscoelastic systems in weighted spaces
Clustering in Weight Space of Feedforward Nets
 ICANN 96, Lecture Notes in Computer Science 1112
, 1996
"... We study symmetries of feedforward networks in terms of their corresponding groups and find that these groups naturally act on and partition weight space. We specify an algorithm to generate representative weight vectors in a specific fundamental domain. The analysis of the metric structure of the ..."
Cited by 3 (0 self)
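The group action mentioned in this abstract is easy to demonstrate: permuting the hidden units of a one-hidden-layer network (together with their incoming and outgoing weights) leaves the computed function unchanged, so such weight vectors are equivalent points of weight space. A tiny sketch with made-up weights and tanh units:

```python
# Hidden-unit permutation symmetry: swapping hidden units (with their
# weights) does not change the network function. Toy weights for illustration.
import math

def net(x, W, v):
    """One-hidden-layer tanh net: W has one row of input weights per hidden
    unit, v holds the corresponding output weights."""
    return sum(vi * math.tanh(sum(wij * xj for wij, xj in zip(row, x)))
               for row, vi in zip(W, v))

W = [[0.5, -1.0], [2.0, 0.3]]
v = [1.5, -0.7]
# Permute the two hidden units together with their output weights.
W_perm = [W[1], W[0]]
v_perm = [v[1], v[0]]

x = [0.2, -0.4]
print(abs(net(x, W, v) - net(x, W_perm, v_perm)) < 1e-12)  # True
```

Sign flips of tanh units (negating a unit's input and output weights) give further symmetries of the same kind, which is why the fundamental domain the authors construct is much smaller than the full weight space.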
Computing semantic relatedness using Wikipedia-based explicit semantic analysis
 In Proceedings of the 20th International Joint Conference on Artificial Intelligence
, 2007
"... Computing semantic relatedness of natural language texts requires access to vast amounts of commonsense and domain-specific world knowledge. We propose Explicit Semantic Analysis (ESA), a novel method that represents the meaning of texts in a high-dimensional space of concepts derived from Wikipedi ..."
Cited by 561 (9 self)
Wikipedia. We use machine learning techniques to explicitly represent the meaning of any text as a weighted vector of Wikipedia-based concepts. Assessing the relatedness of texts in this space amounts to comparing the corresponding vectors using conventional metrics (e.g., cosine). Compared
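The relatedness step this abstract describes reduces to cosine similarity between weighted concept vectors. A minimal sketch, in which the concept vectors are invented for illustration (in ESA they would be derived from Wikipedia articles):

```python
# Cosine similarity of sparse concept->weight vectors, the comparison
# metric named in the ESA abstract. Example vectors are made up.
import math

def cosine(u, v):
    """Cosine similarity of two sparse concept->weight dicts."""
    dot = sum(w * v.get(c, 0.0) for c, w in u.items())
    nu = math.sqrt(sum(w * w for w in u.values()))
    nv = math.sqrt(sum(w * w for w in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

bank_note = {"Currency": 0.9, "Central bank": 0.7, "Paper": 0.2}
river_bank = {"River": 0.8, "Erosion": 0.5, "Paper": 0.1}
money = {"Currency": 0.8, "Central bank": 0.6}

# The financial sense of "bank" is closer to "money" than the river sense.
print(cosine(bank_note, money) > cosine(river_bank, money))  # True
```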
Weight Space Learning Trajectory Visualization
, 1997
"... Visualizing the trajectory followed through weight space when a feedforward neural network is trained is made difficult by the very large dimensionality of the weight space in networks of practical size. A new approach, using Principal Component Analysis, is shown to be effective in a realistic lea ..."
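The PCA approach this abstract describes can be sketched as follows: collect weight snapshots during training, center them, and project onto the leading principal component to get a low-dimensional view of the trajectory. The trajectory below is synthetic (a drift along one direction plus noise), not a real trained network, and the power-iteration PCA is a minimal stand-in for a proper eigendecomposition.

```python
# Sketch: project a weight-space trajectory onto its top principal
# component via power iteration. Synthetic trajectory for illustration.
import math, random

random.seed(0)
d = 10
direction = [1.0 if i == 0 else 0.1 for i in range(d)]
# 50 snapshots of a 10-dim weight vector drifting along `direction` plus noise.
traj = [[t * direction[i] + random.gauss(0, 0.05) for i in range(d)]
        for t in range(50)]

mean = [sum(w[i] for w in traj) / len(traj) for i in range(d)]
centered = [[w[i] - mean[i] for i in range(d)] for w in traj]

# Power iteration on X^T X (without forming the covariance explicitly).
v = [random.gauss(0, 1) for _ in range(d)]
for _ in range(100):
    u = [sum(row[i] * v[i] for i in range(d)) for row in centered]   # X v
    v = [sum(u[k] * centered[k][i] for k in range(len(centered)))    # X^T u
         for i in range(d)]
    norm = math.sqrt(sum(x * x for x in v))
    v = [x / norm for x in v]

# 1-D view of training: projection of each snapshot onto the leading PC.
proj = [sum(row[i] * v[i] for i in range(d)) for row in centered]
```

Because the synthetic drift dominates the noise, the leading component recovers the drift direction and the projection orders the snapshots along the training run.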
Weight Space Analysis and Forecast Uncertainty
"... The usage of location information of weight vectors can help to overcome deficiencies of gradient-based learning for neural networks. We study the nontrivial structure of weight space, i.e., symmetries of feedforward networks in terms of their corresponding groups. ..."
Cited by 2 (2 self)
Estimating the Support of a High-Dimensional Distribution
, 1999
"... Suppose you are given some dataset drawn from an underlying probability distribution P and you want to estimate a "simple" subset S of input space such that the probability that a test point drawn from P lies outside of S is bounded by some a priori specified value between 0 and 1. We propo ..."
Cited by 781 (29 self)
of the weight vector in an associated feature space. The expansion coefficients are found by solving a quadratic programming problem, which we do by carrying out sequential optimization over pairs of input patterns. We also provide a preliminary theoretical analysis of the statistical performance of our
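The expansion this fragment refers to gives a decision function of the form f(x) = Σᵢ αᵢ k(xᵢ, x) − ρ, with the estimated support being {x : f(x) ≥ 0}. A minimal sketch of evaluating such a function: the support vectors, coefficients α, and offset ρ below are invented for illustration, whereas in the paper they come from solving the quadratic program mentioned in the abstract.

```python
# Sketch of a one-class (support-estimation) decision function with an
# RBF kernel. Expansion coefficients and offset are hypothetical, not
# the solution of the paper's quadratic program.
import math

def rbf(a, b, gamma=0.5):
    """Gaussian RBF kernel k(a, b) = exp(-gamma * ||a - b||^2)."""
    return math.exp(-gamma * sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

support_vectors = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
alpha = [0.4, 0.3, 0.3]   # hypothetical expansion coefficients
rho = 0.25                # hypothetical offset

def inside(x):
    """True if x falls in the estimated support region."""
    return sum(a * rbf(sv, x) for a, sv in zip(alpha, support_vectors)) >= rho

# A point near the data is inside the estimated support; a far point is not.
print(inside((0.3, 0.3)), inside((5.0, 5.0)))  # True False
```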