Results 1 – 10 of 2,322
Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
, 2001
"... Variable selection is fundamental to highdimensional statistical modeling, including nonparametric regression. Many approaches in use are stepwise selection procedures, which can be computationally expensive and ignore stochastic errors in the variable selection process. In this article, penalized ..."
Cited by 948 (62 self)
functions are symmetric, nonconcave on (0, ∞), and have singularities at the origin to produce sparse solutions. Furthermore, the penalty functions should be bounded by a constant to reduce bias and satisfy certain conditions to yield continuous solutions. A new algorithm is proposed for optimizing
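The snippet's point about a singularity at the origin producing sparse solutions can be illustrated with the simplest such thresholding rule. This is a generic l1 (lasso) sketch, not the paper's SCAD algorithm; `soft_threshold` is a name chosen here for illustration.

```python
import math

def soft_threshold(z, lam):
    """Minimizer of 0.5*(x - z)**2 + lam*|x|.

    Because |x| is singular (non-differentiable) at the origin, inputs
    with |z| <= lam are mapped exactly to zero, producing sparsity.
    """
    return math.copysign(max(abs(z) - lam, 0.0), z)
```

The SCAD penalty the paper studies behaves like this near the origin but flattens out for large coefficients, which is what reduces bias relative to the plain l1 penalty.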
How does a brain build a cognitive code
 Psychological Review
, 1980
"... This article indicates how competition between afferent data and learned feedback expectancies can stabilize a developing code by buffering committed populations of detectors against continual erosion by new environmental demands. Tille gating phenomena that result lead to dynamically maintained cri ..."
Cited by 253 (94 self)
matching process. The resonant state embodies the perceptual event, or attentional focus, and its amplified and sustained activities are capable of driving slow changes of long-term memory. Mismatch between afferent data and efferent expectancies yields a global suppression of activity
Policy invariance under reward transformations: Theory and application to reward shaping
 In Proceedings of the Sixteenth International Conference on Machine Learning
, 1999
"... This paper investigates conditions under which modifications to the reward function of a Markov decision process preserve the optimal policy. It is shown that, besides the positive linear transformation familiar from utility theory, one can add a reward for transitions between states that is express ..."
Cited by 242 (8 self)
This paper investigates conditions under which modifications to the reward function of a Markov decision process preserve the optimal policy. It is shown that, besides the positive linear transformation familiar from utility theory, one can add a reward for transitions between states
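The transition reward the snippet refers to can be sketched in its familiar potential-based form F(s, s') = γΦ(s') − Φ(s), under which discounted returns shift only by a term depending on the endpoint potentials, leaving optimal policies unchanged. The potential `phi` and the trajectory values below are hypothetical, chosen for illustration.

```python
def shaped_reward(r, s, s_next, phi, gamma=0.9):
    """Original reward plus the potential-based shaping term."""
    return r + gamma * phi(s_next) - phi(s)

def discounted_return(rewards, gamma=0.9):
    return sum(r * gamma**t for t, r in enumerate(rewards))

phi = lambda s: 0.5 * s            # hypothetical potential function
states = [0, 1, 2, 3]              # a sample trajectory of states
rewards = [1.0, 0.0, 2.0]          # original rewards along it
shaped = [shaped_reward(r, s, s2, phi)
          for r, s, s2 in zip(rewards, states, states[1:])]
```

The shaping terms telescope, so the shaped return differs from the original by exactly γ^T·Φ(s_T) − Φ(s_0), independent of the actions taken.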
GALERKIN FINITE ELEMENT APPROXIMATIONS OF STOCHASTIC ELLIPTIC PARTIAL DIFFERENTIAL EQUATIONS
, 2004
"... We describe and analyze two numerical methods for a linear elliptic problem with stochastic coefficients and homogeneous Dirichlet boundary conditions. Here the aim of the computations is to approximate statistical moments of the solution, and, in particular, we give a priori error estimates for the ..."
Cited by 193 (11 self)
We describe and analyze two numerical methods for a linear elliptic problem with stochastic coefficients and homogeneous Dirichlet boundary conditions. Here the aim of the computations is to approximate statistical moments of the solution, and, in particular, we give a priori error estimates
Coil sensitivity encoding for fast MRI. In:
 Proceedings of the ISMRM 6th Annual Meeting,
, 1998
"... New theoretical and practical concepts are presented for considerably enhancing the performance of magnetic resonance imaging (MRI) by means of arrays of multiple receiver coils. Sensitivity encoding (SENSE) is based on the fact that receiver sensitivity generally has an encoding effect complementa ..."
Cited by 193 (3 self)
complementary to Fourier preparation by linear field gradients. Thus, by using multiple receiver coils in parallel, scan time in Fourier imaging can be considerably reduced. The problem of image reconstruction from sensitivity encoded data is formulated in a general fashion and solved for arbitrary coil
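The unfolding problem the snippet describes can be sketched for the smallest case: hypothetical two-fold undersampling, where two pixels alias onto one location and two coils with known (here invented) sensitivities weight them differently, so the pair can be solved for.

```python
import numpy as np

S = np.array([[1.0, 0.3],    # coil 1 sensitivity at the two aliased pixels
              [0.2, 0.9]])   # coil 2 sensitivity at the same pixels
x_true = np.array([4.0, 1.5])   # true pixel intensities
m = S @ x_true                  # one aliased measurement per coil
x_hat = np.linalg.solve(S, m)   # unfold; use least squares when coils > pixels
```

With more coils than aliased pixels the system is overdetermined, which is how SENSE trades coil-array redundancy for scan time.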
Models for Discrete Longitudinal Data
"... This book covers a wide variety of statistical techniques for longitudinal data analysis. The authors, Geert Molenberghs and Geert Verbeke –both well known in this field – have extended their previous textbook (Verbeke and Molenberghs, 1997), mainly focused on linear mixed model for continuous data, ..."
Cited by 172 (16 self)
data. The following sections focus on the special nonlinear models, showing and examining differences between the classes of marginal (Section II: Chapters 6 to 10), conditional (Section III: Chapters 11 and 12), and subject-specific (Section IV: Chapters 13 to 16) models. In these sections
Avoiding Exponential Explosion: Generating Compact Verification Conditions
 SYMPOSIUM ON PRINCIPLES OF PROGRAMMING LANGUAGES
, 2001
"... Current verification condition (VC) generation algorithms, such as weakest preconditions, yield a VC whose size may be exponential in the size of the code fragment being checked. This paper describes a twostage VC generation algorithm that generates compact VCs whose size is worstcase quadratic in ..."
Cited by 123 (7 self)
Current verification condition (VC) generation algorithms, such as weakest preconditions, yield a VC whose size may be exponential in the size of the code fragment being checked. This paper describes a two-stage VC generation algorithm that generates compact VCs whose size is worst-case quadratic
Block-sparse signals: Uncertainty relations and efficient recovery
 IEEE TRANS. SIGNAL PROCESS
, 2010
"... We consider efficient methods for the recovery of blocksparse signals — i.e., sparse signals that have nonzero entries occurring in clusters—from an underdetermined system of linear equations. An uncertainty relation for blocksparse signals is derived, based on a blockcoherence measure, which we ..."
Cited by 161 (17 self)
We consider efficient methods for the recovery of block-sparse signals — i.e., sparse signals that have nonzero entries occurring in clusters — from an underdetermined system of linear equations. An uncertainty relation for block-sparse signals is derived, based on a block-coherence measure, which
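The block-coherence measure mentioned in the snippet can be sketched as follows, assuming a dictionary with unit-norm columns partitioned into blocks of equal length d; `block_coherence` is a name chosen here, and the measure is the largest cross-block Gram spectral norm scaled by 1/d.

```python
import numpy as np

def block_coherence(A, d):
    """Largest spectral norm of a cross-block Gram matrix, scaled by 1/d."""
    n_blocks = A.shape[1] // d
    blocks = [A[:, i*d:(i+1)*d] for i in range(n_blocks)]
    mu = 0.0
    for l in range(n_blocks):
        for r in range(n_blocks):
            if l != r:
                rho = np.linalg.norm(blocks[l].T @ blocks[r], 2)  # spectral norm
                mu = max(mu, rho / d)
    return mu
```

Orthogonal blocks give zero block-coherence, which is the favorable regime for recovery guarantees of this kind.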
The analysis of adaptation in a plant-breeding programme
 Australian Journal of Agricultural Research
, 1963
"... The adaptation of barley varieties was studied by the use of grain yields of a randomly chosen group of 277 varieties from a world collection, grown in replicated trials for several seasons at three sites in South Australia. For each variety a linear regression of yield on the mean yield of all vari ..."
Cited by 146 (0 self)
The adaptation of barley varieties was studied by the use of grain yields of a randomly chosen group of 277 varieties from a world collection, grown in replicated trials for several seasons at three sites in South Australia. For each variety a linear regression of yield on the mean yield of all
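The per-variety regression described can be sketched with hypothetical numbers (not data from the trials): each variety's yield is regressed on the mean yield of all varieties in that environment, and the slope measures how strongly the variety responds to improving conditions.

```python
import numpy as np

env_mean = np.array([2.0, 3.0, 4.5, 5.0])   # hypothetical site-season mean yields
variety  = np.array([1.5, 2.9, 5.1, 5.9])   # one variety's yields at those sites
slope, intercept = np.polyfit(env_mean, variety, 1)
```

A slope above 1 indicates a variety that outperforms the average in favorable environments but falls behind in poor ones; a slope near 1 indicates average stability.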
Sets of matrices all infinite products of which converge. Linear Algebra and its Applications
, 1992
"... An infinite product IIT = lMi of matrices converges (on the right) if limi _ _ M,... Mi exists. A set Z = (Ai: i> l} of n X n matrices is called an RCP set (rightconvergent product set) if all infinite products with each element drawn from Z converge. Such sets of matrices arise in constructing ..."
Cited by 114 (0 self)
An infinite product Π_{i=1}^∞ M_i of matrices converges (on the right) if lim_{i→∞} M_1 ⋯ M_i exists. A set Σ = {A_i : i ≥ 1} of n × n matrices is called an RCP set (right-convergent product set) if all infinite products with each element drawn from Σ converge. Such sets of matrices arise in constructing
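Right convergence can be checked numerically for a toy one-matrix set {A}, where the partial products M_1 ⋯ M_i are just the powers A^i; the matrix below is illustrative, chosen so the powers approach a fixed limit.

```python
import numpy as np

def right_product(mats):
    """Partial product M_1 @ M_2 @ ... @ M_k, multiplying on the right."""
    P = np.eye(mats[0].shape[0])
    for M in mats:
        P = P @ M
    return P

A = np.array([[1.0, 0.5],
              [0.0, 0.5]])
P = right_product([A] * 60)   # A^60, numerically at its limit
```

Here A^n = [[1, 1 − 0.5^n], [0, 0.5^n]], so the partial products converge to [[1, 1], [0, 0]].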