Results 11 - 20 of 133,236
Convex Analysis, 1970
"... In this book we aim to present, in a unified framework, a broad spectrum of mathematical theory that has grown in connection with the study of problems of optimization, equilibrium, control, and stability of linear and nonlinear systems. The title Variational Analysis reflects this breadth. For a long time, ‘variational’ problems have been identified mostly with the ‘calculus of variations’. In that venerable subject, built around the minimization of integral functionals, constraints were relatively simple and much of the focus was on infinite-dimensional function spaces. A major theme ..."
Cited by 5411 (68 self)
The Askey-scheme of hypergeometric orthogonal polynomials and its q-analogue, 1998
"... We list the so-called Askey-scheme of hypergeometric orthogonal polynomials and we give a q-analogue of this scheme containing basic hypergeometric orthogonal polynomials. In chapter 1 we give the definition, the orthogonality relation, the three term recurrence relation, the second order differential or difference equation, the forward and backward shift operator, the Rodrigues-type formula and generating functions of all classes of orthogonal polynomials in this scheme. In chapter 2 we give the limit relations between different classes of orthogonal polynomials listed in the Askey ..."
Cited by 578 (6 self)
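Every family in the Askey scheme satisfies a three-term recurrence of the form p_{n+1}(x) = (A_n x + B_n) p_n(x) - C_n p_{n-1}(x). A minimal sketch of how such a recurrence is used in practice, taking the Legendre polynomials (one member of the scheme) as a concrete instance; the function name is illustrative, not from the report:

    def legendre(n, x):
        """Evaluate the Legendre polynomial P_n(x) via its three-term
        recurrence (k+1) P_{k+1}(x) = (2k+1) x P_k(x) - k P_{k-1}(x)."""
        p_prev, p_curr = 1.0, x  # P_0(x) = 1, P_1(x) = x
        if n == 0:
            return p_prev
        for k in range(1, n):
            p_prev, p_curr = p_curr, ((2 * k + 1) * x * p_curr - k * p_prev) / (k + 1)
        return p_curr

    print(legendre(3, 0.5))  # P_3(x) = (5x^3 - 3x)/2, so this prints -0.4375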
Evaluating the Accuracy of Sampling-Based Approaches to the Calculation of Posterior Moments
In Bayesian Statistics, 1992
"... Data augmentation and Gibbs sampling are two closely related, sampling-based approaches to the calculation of posterior moments. The fact that each produces a sample whose constituents are neither independent nor identically distributed complicates the assessment of convergence and numerical accuracy of the approximations to the expected value of functions of interest under the posterior. In this paper methods from spectral analysis are used to evaluate numerical accuracy formally and construct diagnostics for convergence. These methods are illustrated in the normal linear model ..."
Cited by 604 (12 self)
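A minimal sketch of the diagnostic idea described here, assuming a single chain of draws: the paper estimates numerical standard errors from the spectral density at frequency zero; the batch-means estimator below is a simpler stand-in for that spectral estimate, and all function names are illustrative.

    import numpy as np

    def numerical_se(chain, n_batches=20):
        """Numerical standard error of the chain mean via batch means
        (a stand-in for the paper's spectral estimator)."""
        means = np.array([b.mean() for b in np.array_split(np.asarray(chain), n_batches)])
        return means.std(ddof=1) / np.sqrt(n_batches)

    def geweke_z(chain, first=0.1, last=0.5):
        """Compare the mean of the early part of the chain with the late part;
        |z| well above 2 suggests the sampler has not converged."""
        chain = np.asarray(chain)
        a = chain[: int(first * len(chain))]
        b = chain[-int(last * len(chain)):]
        return (a.mean() - b.mean()) / np.hypot(numerical_se(a), numerical_se(b))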
Notions of Computation and Monads, 1991
"... The λ-calculus is considered a useful mathematical tool in the study of programming languages, since programs can be identified with λ-terms. However, if one goes further and uses βη-conversion to prove equivalence of programs, then a gross simplification is introduced (programs are identified with total functions from values to values) that may jeopardise the applicability of theoretical results. In this paper we introduce calculi, based on a categorical semantics for computations, that provide a correct basis for proving equivalence of programs for a wide range of notions of computation."
Cited by 867 (15 self)
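A minimal sketch of the idea in the snippet, far removed from the paper's categorical machinery: if programs denote functions from values to computations of values, a monad supplies unit (embed a value) and bind (sequence computations). Below, partiality as the notion of computation, encoded with plain tuples; all names are illustrative.

    def unit(x):
        """Embed a pure value into the 'Maybe' computation type."""
        return ("just", x)

    def bind(m, f):
        """Sequence computations: feed the value onward, or propagate failure."""
        return f(m[1]) if m[0] == "just" else m

    NOTHING = ("nothing", None)

    def safe_div(x, y):
        return NOTHING if y == 0 else unit(x / y)

    # (10 / 2) then (result / 0): the failure of the second step propagates.
    print(bind(bind(unit(10), lambda v: safe_div(v, 2)),
               lambda v: safe_div(v, 0)))  # ('nothing', None)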
The moderator–mediator variable distinction in social psychological research: Conceptual, strategic, and statistical considerations
Journal of Personality and Social Psychology, 1986
"... In this article, we attempt to distinguish between the properties of moderator and mediator variables at a number of levels. First, we seek to make theorists and researchers aware of the importance of not using the terms moderator and mediator interchangeably by carefully elaborating, both conceptually ..."
"... of this analysis is to distinguish between the properties of moderator and mediator variables in such a way as to clarify the different ways in which conceptual variables may account for differences in people's behavior. Specifically, we differentiate between two often-confused functions of third variables ..."
Cited by 5736 (8 self)
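A minimal sketch of the regression logic this distinction leads to, on synthetic data; the three regressions below are the steps commonly associated with this paper, and every variable name and coefficient is illustrative. A moderator, by contrast, would enter the final model as an x*z interaction term.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 500
    x = rng.normal(size=n)                        # predictor
    m = 0.6 * x + rng.normal(size=n)              # mediator: x -> m
    y = 0.5 * m + 0.1 * x + rng.normal(size=n)    # outcome: effect mostly via m

    total = sm.OLS(y, sm.add_constant(x)).fit()                          # x -> y
    a_path = sm.OLS(m, sm.add_constant(x)).fit()                         # x -> m
    direct = sm.OLS(y, sm.add_constant(np.column_stack([x, m]))).fit()   # x, m -> y
    # Mediation is suggested when x's coefficient shrinks from `total` to
    # `direct` while m remains a significant predictor of y.
    print(total.params[1], direct.params[1], direct.params[2])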
Hierarchical Reinforcement Learning with the MAXQ Value Function Decomposition
Journal of Artificial Intelligence Research, 2000
"... This paper presents a new approach to hierarchical reinforcement learning based on decomposing the target Markov decision process (MDP) into a hierarchy of smaller MDPs and decomposing the value function of the target MDP into an additive combination of the value functions of the smaller MDPs. ..."
Cited by 443 (6 self)
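The decomposition's central identity is Q(p, s, a) = V(a, s) + C(p, s, a): the value of invoking subtask a in state s under parent task p is the expected reward of completing a plus the completion value of finishing p afterwards. A minimal sketch with hand-filled tables in place of learned values; the task and state names are illustrative:

    # Hand-filled stand-ins for learned value tables (illustrative only).
    V = {("navigate", "s0"): -3.0, ("pickup", "s0"): -1.0}  # V(a, s)
    C = {("root", "s0", "navigate"): -2.0,                   # C(p, s, a)
         ("root", "s0", "pickup"): -7.0}

    def q(parent, state, subtask):
        """MAXQ identity: Q(p, s, a) = V(a, s) + C(p, s, a)."""
        return V[(subtask, state)] + C[(parent, state, subtask)]

    # Greedy choice at the root: the subtask with the largest decomposed Q.
    best = max(["navigate", "pickup"], key=lambda a: q("root", "s0", a))
    print(best, q("root", "s0", best))  # navigate -5.0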
Object Detection with Discriminatively Trained Part Based Models
"... We describe an object detection system based on mixtures of multiscale deformable part models. Our system is able to represent highly variable object classes and achieves state-of-the-art results in the PASCAL object detection challenges. While deformable part models have become quite popular, their value had not been demonstrated on difficult benchmarks such as the PASCAL datasets. Our system relies on new methods for discriminative training with partially labeled data. We combine a margin-sensitive approach for data-mining hard negative examples with a formalism we call latent SVM. A latent SVM ..."
Cited by 1422 (49 self)
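A minimal sketch of the latent-SVM scoring rule behind this formalism: an example's score is the maximum over latent values z (part placements, in the detection setting) of a linear function of a feature vector, f(x) = max_z w . phi(x, z). The tiny feature vectors below are illustrative stand-ins for the paper's HOG-based features.

    import numpy as np

    def latent_score(w, feats_per_z):
        """f(x) = max_z w . phi(x, z); returns the best score and which z won."""
        scores = feats_per_z @ w
        z = int(np.argmax(scores))
        return scores[z], z

    w = np.array([1.0, -0.5])
    feats_per_z = np.array([[0.2, 0.4],   # phi(x, z=0)
                            [0.9, 0.1]])  # phi(x, z=1)
    print(latent_score(w, feats_per_z))   # (0.85, 1)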
Symbolic Model Checking for Real-time Systems
Information and Computation, 1992
"... We describe finite-state programs over real-numbered time in a guarded-command language with real-valued clocks or, equivalently, as finite automata with real-valued clocks. Model checking answers the question which states of a real-time program satisfy a branching-time specification (given in an ..."
Cited by 578 (50 self)
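A minimal sketch of the model itself, not of the paper's symbolic model-checking algorithm: a finite automaton extended with a real-valued clock, where guards compare the clock to constants and edges may reset it. The automaton below is illustrative.

    # Edges: (source, guard over clock c, reset clock?, target).
    edges = [
        ("idle",    lambda c: True,   True,  "waiting"),  # start; reset the clock
        ("waiting", lambda c: c <= 2, False, "served"),   # serve within 2 time units
        ("waiting", lambda c: c > 2,  False, "timeout"),  # otherwise time out
    ]

    def step(state, clock, delay):
        """Let real-valued time `delay` elapse, then take the first enabled edge."""
        clock += delay
        for src, guard, reset, dst in edges:
            if src == state and guard(clock):
                return dst, 0.0 if reset else clock
        return state, clock

    s, c = step("idle", 0.0, 0.0)  # -> ('waiting', 0.0)
    print(step(s, c, 2.5))         # -> ('timeout', 2.5)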
Nonparametric estimation of average treatment effects under exogeneity: a review
Review of Economics and Statistics, 2004
"... Recently there has been a surge in econometric work focusing on estimating average treatment effects under various sets of assumptions. One strand of this literature has developed methods for estimating average treatment effects for a binary treatment under assumptions variously described as exogeneity, unconfoundedness, or selection on observables. The implication of these assumptions is that systematic (for example, average or distributional) differences in outcomes between treated and control units with the same values for the covariates are attributable to the treatment. Recent analysis has ..."
Cited by 630 (25 self)
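A minimal sketch of the implication quoted above, on synthetic data: once units are compared within the same covariate value, the treated-control contrast recovers the treatment effect, while the raw contrast is confounded. The data-generating numbers are illustrative.

    import numpy as np

    rng = np.random.default_rng(1)
    n = 10_000
    x = rng.integers(0, 3, size=n)               # discrete covariate, 3 strata
    t = rng.random(n) < 0.2 + 0.2 * x            # treatment more likely at high x
    y = 2.0 * t + 1.0 * x + rng.normal(size=n)   # true average effect = 2

    naive = y[t].mean() - y[~t].mean()           # confounded by x
    ate = np.mean([y[t & (x == k)].mean() - y[~t & (x == k)].mean()
                   for k in range(3)])           # within-stratum contrasts, averaged
    print(round(naive, 2), round(ate, 2))        # naive is biased upward; ate is ~2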
Entropy-Based Algorithms For Best Basis Selection
IEEE Transactions on Information Theory, 1992
"... pretations (position, frequency, and scale), and we have experimented with feature-extraction methods that use best-basis compression for front-end complexity reduction. The method relies heavily on the remarkable orthogonality properties of the new libraries. It is obviously a nonlinear transformation to represent a signal in its own best basis, but since the transformation is orthogonal once the basis is chosen, compression via the best-basis method is not drastically affected by noise: the noise energy in the transform values cannot exceed the noise energy in the original signal. Furthermore ..."
Cited by 675 (20 self)
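A minimal sketch of the entropy criterion at a single node of the tree: compute an entropy cost for a parent block of coefficients and for its two children under an orthogonal split, and keep whichever is cheaper (the paper normalizes against the total signal energy so the cost is additive across the tree; this sketch normalizes per node for brevity). The Haar split below is an illustrative stand-in for the paper's wavelet-packet libraries.

    import numpy as np

    def entropy_cost(v):
        """Shannon-type entropy of the normalized squared coefficients."""
        energy = np.sum(v**2)
        if energy == 0:
            return 0.0
        p = v**2 / energy
        p = p[p > 0]
        return -np.sum(p * np.log(p))

    def haar_split(v):
        """One orthogonal split into approximation and detail children."""
        return (v[0::2] + v[1::2]) / np.sqrt(2), (v[0::2] - v[1::2]) / np.sqrt(2)

    signal = np.array([4.0, 4.0, 4.0, 4.0, 0.0, 0.0, 0.0, 0.0])
    a, d = haar_split(signal)
    parent = entropy_cost(signal)                  # ~1.386
    children = entropy_cost(a) + entropy_cost(d)   # ~0.693
    # The full method applies this test recursively down the tree.
    print("split" if children < parent else "keep")  # split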