Results 11 - 20 of 574
Analysis of Sparse Bayesian Learning
- Advances in Neural Information Processing Systems 14, 2001
"... The recent introduction of the `relevance vector machine' has effectively demonstrated how sparsity may be obtained in generalised linear models within a Bayesian framework. Using a particular form of Gaussian parameter prior, `learning' is the maximisation, with respect to hyperparamete ..."
Cited by 58 (1 self)
`sparsity criterion' is satisfied, this maximum is exactly equivalent to `pruning' the corresponding parameter from the model.
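The pruning mechanism this snippet describes can be illustrated with a toy calculation (an illustrative sketch, not code from the cited paper; all names and data are made up). Each weight gets a Gaussian prior N(0, 1/alpha); as alpha grows, the posterior mean of the weight is driven to zero, which is exactly the "pruning" of that basis function:

```python
# Toy single-weight Bayesian linear regression: prior w ~ N(0, 1/alpha),
# Gaussian noise with precision beta.  Large alpha shrinks the posterior
# mean toward zero, i.e. the parameter is effectively pruned.

def posterior_mean(x, y, alpha, beta=25.0):
    """Posterior mean of the weight under prior N(0, 1/alpha)."""
    s = 1.0 / (alpha + beta * sum(xi * xi for xi in x))       # posterior variance
    return beta * s * sum(xi * yi for xi, yi in zip(x, y))    # posterior mean

x = [0.0, 1.0, 2.0, 3.0]
y = [0.1, 1.9, 4.2, 5.8]            # roughly y = 2x

for alpha in (1e-3, 1.0, 1e6):      # weak prior ... overwhelming prior
    print(alpha, posterior_mean(x, y, alpha))
```

With a weak prior the weight stays near its least-squares value (about 2); with alpha = 1e6 it collapses to nearly zero.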
Subspace Information Criterion for Non-Quadratic Regularizers - Model Selection for Sparse Regressors
- IEEE Transactions on Neural Networks, 2002
"... Non-quadratic regularizers, in particular the ℓ1 norm regularizer, can yield sparse solutions that generalize well. In this work we propose the Generalized Subspace Information Criterion (GSIC), which allows one to predict the generalization error for this useful family of regularizers. We show that un ..."
Cited by 9 (7 self)
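Why the ℓ1 penalty produces exact zeros can be shown in a few lines (a minimal sketch of the sparsity mechanism, not of the paper's GSIC method): for orthonormal features the ℓ1-penalized solution is soft-thresholding of the least-squares coefficients, so small coefficients become exactly zero:

```python
# Soft-thresholding: the closed-form minimizer of
#   0.5 * (w - z)**2 + lam * |w|
# which is the per-coordinate lasso update for orthonormal designs.

def soft_threshold(z, lam):
    if z > lam:
        return z - lam
    if z < -lam:
        return z + lam
    return 0.0

ols = [3.0, -0.2, 0.5, -4.0]        # hypothetical least-squares coefficients
sparse = [soft_threshold(z, lam=1.0) for z in ols]
print(sparse)                        # -> [2.0, 0.0, 0.0, -3.0]
```

Coefficients whose magnitude falls below the penalty level are set exactly to zero rather than merely shrunk, which is the sparsity the snippet refers to.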
Sparse modeling of landmark and texture variability using the orthomax criterion
- International Symposium on Medical Imaging, 2006
"... In the past decade, statistical shape modeling has been widely popularized in the medical image analysis community. Predominantly, principal component analysis (PCA) has been employed to model biological shape variability. Here, a reparameterization with orthogonal basis vectors is obtained such tha ..."
Cited by 11 (1 self)
Experimental results are given on chest radiographs, magnetic resonance images of the brain, and face images. Since pathologies are typically spatially localized, either with respect to shape or texture, we anticipate many medical applications where sparse parameterizations are preferable to the conventional
Fisher Discrimination Dictionary Learning for Sparse Representation
"... Sparse representation based classification has led to interesting image recognition results, while the dictionary used for sparse coding plays a key role in it. This paper presents a novel dictionary learning (DL) method to improve the pattern classification performance. Based on the Fisher discrimi ..."
Cited by 72 (9 self)
discrimination criterion, a structured dictionary, whose dictionary atoms have correspondence to the class labels, is learned so that the reconstruction error after sparse coding can be used for pattern classification. Meanwhile, the Fisher discrimination criterion is imposed on the coding coefficients so
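The classification rule the snippet describes, assigning a signal to the class whose sub-dictionary reconstructs it best, can be sketched as follows (a hedged illustration of the residual-based idea, not the authors' FDDL algorithm; each hypothetical sub-dictionary here is a single unit-norm atom, so "coding" reduces to a projection):

```python
# Classify a test signal by minimum reconstruction error over
# class-specific dictionary atoms.

def residual(signal, atom):
    """Squared error left after projecting `signal` onto a unit-norm `atom`."""
    coef = sum(s * a for s, a in zip(signal, atom))
    return sum((s - coef * a) ** 2 for s, a in zip(signal, atom))

class_atoms = {                      # hypothetical per-class dictionaries
    "edges":   [1.0, 0.0, 0.0],
    "texture": [0.0, 0.6, 0.8],
}
test_signal = [0.1, 0.61, 0.79]

label = min(class_atoms, key=lambda c: residual(test_signal, class_atoms[c]))
print(label)                         # -> texture
```

In the paper's setting the atoms additionally carry class labels learned under the Fisher criterion; here the correspondence is simply hard-coded for illustration.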
Dynamic Sparsing in Stiff Extrapolation Methods
, 1992
"... Based on a simple stability analysis for the semi-implicit Euler discretization, a new dynamic sparsing procedure is derived. This procedure automatically eliminates "small" elements of the Jacobian matrix. As a consequence, the amount of work needed to handle the linear algebra within a s ..."
Cited by 2 (0 self)
semi-implicit extrapolation integrator can be reduced drastically. Within the course of integration, the sparsing criterion, which decides what "small" means, is dynamically adapted to ensure stability of the discretization scheme. Thus, stepsize restrictions due to instability can be avoided
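The basic sparsing step, before any dynamic adaptation, is just thresholding the Jacobian (a toy illustration under stated assumptions, not the paper's stability-driven scheme; in the paper the tolerance is adapted during integration):

```python
# Zero out Jacobian entries below a tolerance so the linear algebra in
# each semi-implicit step touches fewer elements.

def sparsify(jacobian, tol):
    """Return a copy of the matrix with entries |J_ij| < tol set to zero."""
    return [[v if abs(v) >= tol else 0.0 for v in row] for row in jacobian]

J = [[4.0,  0.01, 0.0],
     [0.02, -3.0, 0.5],
     [0.0,  0.001, 2.0]]

Js = sparsify(J, tol=0.1)
nonzeros = sum(v != 0.0 for row in Js for v in row)
print(nonzeros)                      # -> 4 of 9 entries survive
```

The saving comes from factorizing the sparsified matrix instead of the dense one; the cited work's contribution is choosing `tol` so that the discretization stays stable.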
On Model Selection Criterion in Capture-Recapture Experiments with Sparse Data
"... ABSTRACT Model selection involving sparse data is always difficult; this is because, with sparse data, quite different models can appear to fit adequately with highly diverse point, and it is also almost impossible to test the underlying assumptions and select the "best" model. Some ecol ..."
ecological as well as epidemiological experiments result in sparse data. In this paper we propose a modified Akaike information criterion (call it AICJ) for selecting models in capture-recapture experiments resulting in sparse data. The proposed criterion was compared with the Akaike Information Criterion
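The snippet does not give the exact form of the proposed AICJ, so only the standard AIC it modifies is sketched here (illustrative code; the model names and likelihood values are hypothetical):

```python
# AIC = 2k - 2 ln L, where k is the number of parameters and L the
# maximized likelihood; the model with the smallest AIC is preferred.

def aic(k, log_likelihood):
    return 2 * k - 2 * log_likelihood

# hypothetical fits: (number of parameters, maximized log-likelihood)
models = {"M0": (2, -120.4), "M1": (4, -117.9), "M2": (9, -117.1)}

scores = {name: aic(k, ll) for name, (k, ll) in models.items()}
best = min(scores, key=scores.get)
print(best, scores[best])            # -> M1 243.8
```

Note how the penalty term 2k lets the mid-sized model beat the larger one despite a slightly worse fit; with sparse data that trade-off is exactly what becomes unreliable, motivating the modified criterion.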
Sparse incremental regression modeling using correlation criterion with boosting search
- IEEE Signal Processing Letters, 2005
"... Abstract—A novel technique is presented to construct sparse generalized Gaussian kernel regression models. The proposed method appends regressors in an incremental modeling by tuning the mean vector and diagonal covariance matrix of an individual Gaussian regressor to best fit the training data, bas ..."
Cited by 2 (0 self)
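The correlation criterion in the title can be sketched as a single selection step (an illustrative sketch, not the authors' boosting search over Gaussian kernel parameters): at each incremental stage, append the candidate regressor most correlated with the current residual:

```python
# Pick the candidate regressor whose output correlates most strongly
# (in absolute value) with the current model residual.

def correlation(u, v):
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    cov = sum((a - mu) * (b - mv) for a, b in zip(u, v))
    su = sum((a - mu) ** 2 for a in u) ** 0.5
    sv = sum((b - mv) ** 2 for b in v) ** 0.5
    return cov / (su * sv)

res = [0.9, 2.1, 2.9, 4.2]          # current residual (hypothetical)
candidates = {                       # candidate regressor outputs (hypothetical)
    "g1": [2.0, 2.0, 3.0, 3.0],
    "g2": [1.0, 2.0, 3.0, 4.0],
    "g3": [1.0, -1.0, 1.0, -1.0],
}
best = max(candidates, key=lambda c: abs(correlation(candidates[c], res)))
print(best)                          # -> g2
```

In the cited method each candidate is a Gaussian kernel whose mean and diagonal covariance are tuned before this comparison; here the candidates are simply fixed vectors.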
Sparse LS-SVMs using Additive Regularization with a Penalized Validation Criterion
"... This paper is based on a new way for determining the regularization trade-off in least squares support vector machines (LS-SVMs) via a mechanism of additive regularization which has been recently introduced in [6]. This framework enables computational fusion of training and validation levels and ..."
and allows one to train the model together with finding the regularization constants by solving a single linear system at once. In this paper we show that this framework allows a penalized validation criterion that leads to sparse LS-SVMs. The model, regularization constants and sparseness follow
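The "single linear system" phrasing refers to the standard LS-SVM dual formulation, shown here on a two-point toy problem (a hedged sketch of plain LS-SVM training with a fixed regularization constant, not the paper's additive-regularization fusion of training and validation):

```python
# LS-SVM dual: solve  [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]
# in one shot.  Tiny pure-Python Gaussian elimination for small systems.

def solve(A, b):
    """Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

X, y, gamma = [1.0, -1.0], [1.0, -1.0], 1.0   # two training points, linear kernel
K = [[xi * xj for xj in X] for xi in X]
A = [[0.0, 1.0, 1.0],
     [1.0, K[0][0] + 1 / gamma, K[0][1]],
     [1.0, K[1][0], K[1][1] + 1 / gamma]]
sol = solve(A, [0.0] + y)
bias, alphas = sol[0], sol[1:]
print(bias, alphas)                  # symmetric problem: bias 0, alphas +-1/3
```

Everything, bias and dual coefficients alike, falls out of one solve; the cited framework extends this so the regularization trade-off itself is determined within the same linear system.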
Dictionary Learning for Sparse Decomposition: A New Criterion and Algorithm
"... During the last decade, there has been a growing interest toward the problem of sparse decomposition. A very important task in this field is dictionary learning, which is designing a suitable dictionary that can sparsely represent a group of training signals. In most dictionary learning algorithms, ..."
Coil sensitivity encoding for fast MRI
- Proceedings of the ISMRM 6th Annual Meeting, 1998
"... New theoretical and practical concepts are presented for considerably enhancing the performance of magnetic resonance imaging (MRI) by means of arrays of multiple receiver coils. Sensitivity encoding (SENSE) is based on the fact that receiver sensitivity generally has an encoding effect complementa ..."
Cited by 193 (3 self)
shape criterion is weaker in favor of the SNR. With both strategies the reconstruction algorithm is numerically demanding in the general case. This is mainly because with hybrid encoding the bulk of the work of reconstruction can usually not be done by fast Fourier transform (FFT). However, it is shown
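The core unfolding step of sensitivity encoding can be illustrated on the smallest possible case (a hedged sketch of the principle, not the paper's full reconstruction; coil sensitivities and intensities are hypothetical): with a reduction factor of 2, each aliased pixel superimposes two true pixels, each coil contributes one equation, and the pair is recovered by solving a 2x2 system:

```python
# SENSE-style unfolding for 2 coils / reduction factor 2:
# y_c = S[c][0] * x0 + S[c][1] * x1 for each coil c; solve for (x0, x1).

def unfold(y, S):
    """Solve the 2x2 system S @ x = y by Cramer's rule."""
    (a, b), (c, d) = S
    det = a * d - b * c
    return [(y[0] * d - b * y[1]) / det,
            (a * y[1] - y[0] * c) / det]

S = [[0.9, 0.2],     # coil 1 sensitivity at the two superimposed positions
     [0.3, 0.8]]     # coil 2
x_true = [2.0, 5.0]  # true pixel intensities

y = [sum(s * x for s, x in zip(row, x_true)) for row in S]  # aliased signals
print(unfold(y, S))  # -> [2.0, 5.0]
```

In practice the system is solved per aliased pixel across the whole image, and, as the excerpt notes, with general hybrid encoding this no longer reduces to an FFT.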