Results 1-10 of 1,841,682
On Fuzzy Inference by the Least Squares Method*
"... A very different suggestion to the Zadeh's rule for fuzzy inference is the least squares method proposed by T. D. Pham and Valliappan. The aim of this paper is to study the possible contradictions appearing in this method and, additionally, to extend it when functions are not necessarily monoto ..."
Abstract
 Add to MetaCart
A very different suggestion to the Zadeh's rule for fuzzy inference is the least squares method proposed by T. D. Pham and Valliappan. The aim of this paper is to study the possible contradictions appearing in this method and, additionally, to extend it when functions are not necessarily
Least-Squares Policy Iteration
JOURNAL OF MACHINE LEARNING RESEARCH, 2003
"... We propose a new approach to reinforcement learning for control problems which combines value-function approximation with linear architectures and approximate policy iteration. This new approach ..."
Cited by 461 (12 self)
Least Median of Squares Regression
JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION, 1984
Benchmarking Least Squares Support Vector Machine Classifiers
NEURAL PROCESSING LETTERS, 2001
"... In Support Vector Machines (SVMs), the solution of the classification problem is characterized by a (convex) quadratic programming (QP) problem. In a modified version of SVMs, called Least Squares SVM classifiers (LS-SVMs), a least squares cost function is proposed so as to obtain a linear set of equations ..."
Cited by 446 (46 self)
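The LS-SVM idea described in this abstract, replacing the SVM's QP with one linear system, can be sketched in a few lines of numpy. This is a minimal illustration of the standard LS-SVM dual (with an RBF kernel assumed for concreteness; function names and hyperparameter values are illustrative, not from the paper):

```python
import numpy as np

def lssvm_train(X, y, gamma=10.0, sigma=1.0):
    """Train a least squares SVM classifier by solving one linear system.

    The LS-SVM dual replaces the SVM's QP with the KKT linear system
        [ 0   y^T              ] [b]     [0]
        [ y   Omega + I/gamma  ] [alpha] = [1]
    where Omega_ij = y_i * y_j * K(x_i, x_j) (here an RBF kernel).
    """
    n = len(y)
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-sq / (2 * sigma ** 2))       # RBF kernel matrix
    Omega = np.outer(y, y) * K
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)            # one linear solve, no QP
    return sol[1:], sol[0]                   # alpha, b

def lssvm_predict(X_train, y_train, alpha, b, X_test, sigma=1.0):
    sq = ((X_test[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
    K = np.exp(-sq / (2 * sigma ** 2))
    return np.sign(K @ (alpha * y_train) + b)
```

The trade-off the benchmark paper examines: the linear system is cheap to solve, but every training point gets a nonzero alpha, so sparseness of the solution is lost relative to standard SVMs.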
LSQR: An Algorithm for Sparse Linear Equations and Sparse Least Squares
ACM Trans. Math. Software, 1982
"... An iterative method is given for solving Ax = b and min ||Ax - b||_2, where the matrix A is large and sparse. The method is based on the bidiagonalization procedure of Golub and Kahan. It is analytically equivalent to the standard method of conjugate gradients, but possesses more favorable numerical ..."
Cited by 649 (21 self)
"... gradient algorithms, indicating that LSQR is the most reliable algorithm when A is ill-conditioned. Categories and Subject Descriptors: G.1.2 [Numerical Analysis]: Approximation - least squares approximation; G.1.3 [Numerical Analysis]: Numerical Linear Algebra - linear systems (direct and ..."
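The problem LSQR targets, min ||Ax - b||_2, can be stated concretely with a small dense numpy reference (for the large sparse case the abstract describes, scipy.sparse.linalg.lsqr implements the algorithm itself; the data below is synthetic and illustrative):

```python
import numpy as np

# A small overdetermined least squares problem min ||Ax - b||_2,
# solved with numpy's lstsq as a dense reference point.
rng = np.random.default_rng(0)
A = rng.normal(size=(100, 5))
x_true = np.arange(1.0, 6.0)
b = A @ x_true + 0.01 * rng.normal(size=100)

x, residual, rank, sv = np.linalg.lstsq(A, b, rcond=None)

# The minimizer is characterized by the normal equations A^T A x = A^T b;
# LSQR reaches the same point iteratively without ever forming A^T A.
assert np.allclose(A.T @ A @ x, A.T @ b)
```

Avoiding the explicit product A^T A is exactly why LSQR behaves better than conjugate gradients on the normal equations when A is ill-conditioned, since forming A^T A squares the condition number.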
Estimation and Inference in Econometrics
1993
"... The astonishing increase in computer performance over the past two decades has made it possible for economists to base many statistical inferences on simulated, or bootstrap, distributions rather than on distributions obtained from asymptotic theory. In this paper, I review some of the basic ideas of ..."
Cited by 1151 (3 self)
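The simulated-distribution idea in this abstract, basing inference on resampled data rather than asymptotic theory, is the percentile bootstrap. A minimal sketch (function name and defaults are illustrative, not from the text):

```python
import numpy as np

def bootstrap_ci(data, stat, n_boot=2000, alpha=0.05, seed=0):
    """Percentile bootstrap confidence interval for a statistic.

    Instead of appealing to an asymptotic distribution, resample the
    data with replacement, recompute the statistic each time, and read
    the interval off the simulated distribution.
    """
    rng = np.random.default_rng(seed)
    n = len(data)
    stats = np.array([stat(data[rng.integers(0, n, n)])
                      for _ in range(n_boot)])
    lo, hi = np.quantile(stats, [alpha / 2, 1 - alpha / 2])
    return lo, hi
```

For example, a 95% interval for a sample mean is just `bootstrap_ci(sample, np.mean)`; no normality assumption or standard-error formula is needed.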
Least Angle Regression
Ann. Statist.
"... The purpose of model selection algorithms such as All Subsets, Forward Selection and Backward Elimination is to choose a linear model on the basis of the same set of data to which the model will be applied. Typically we have available a large collection of possible covariates from which we hope to select ..."
Cited by 1308 (43 self)
"... implements the Lasso, an attractive version of ordinary least squares that constrains the sum of the absolute regression coefficients; the LARS modification calculates all possible Lasso estimates for a given problem, using an order of magnitude less computer time than previous methods. (2) A different LARS ..."
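The Lasso estimator this abstract refers to, least squares with a constraint on the sum of absolute coefficients, can be computed for a single penalty value by cyclic coordinate descent. This is not LARS (which traces the whole solution path); it is a different, simpler algorithm for the same estimator, sketched here under the assumption of standardized columns:

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    """Lasso via cyclic coordinate descent, minimizing
        (1/2n) * ||y - X beta||^2 + lam * ||beta||_1
    for one fixed penalty lam. Columns of X are assumed standardized
    (mean 0, variance 1); y is assumed centered.
    """
    n, p = X.shape
    beta = np.zeros(p)
    z = (X * X).sum(axis=0) / n          # per-coordinate curvature
    for _ in range(n_iter):
        for j in range(p):
            # partial residual: remove coordinate j's own contribution
            r = y - X @ beta + X[:, j] * beta[j]
            rho = X[:, j] @ r / n
            # soft-thresholding step: small correlations are zeroed out
            beta[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / z[j]
    return beta
```

The soft-threshold step makes the sparsity mechanical: any covariate whose correlation with the residual stays below lam gets an exactly zero coefficient, which is the model-selection behavior the abstract is describing.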
Graphical models, exponential families, and variational inference
2008
"... The formalism of probabilistic graphical models provides a unifying framework for capturing complex dependencies among random variables, and building large-scale multivariate statistical models. Graphical models have become a focus of research in many statistical, computational and mathematical fields ..."
Cited by 800 (26 self)
"... all be understood in terms of exact or approximate forms of these variational representations. The variational approach provides a complementary alternative to Markov chain Monte Carlo as a general source of approximation methods for inference in large-scale statistical models. ..."
Direct Least Square Fitting of Ellipses
1998
"... This work presents a new efficient method for fitting ellipses to scattered data. Previous algorithms either fitted general conics or were computationally expensive. By minimizing the algebraic distance subject to the constraint 4ac - b² = 1, the new method incorporates the ellipticity constraint ..."
Cited by 421 (3 self)
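The constrained minimization in this abstract, algebraic distance subject to 4ac - b² = 1, reduces to a generalized eigenvalue problem. The following is a minimal numpy sketch of that construction, not the numerically hardened procedure from the paper (the eigenvector selection here simply takes the cheapest candidate satisfying the ellipse constraint):

```python
import numpy as np

def fit_ellipse(x, y):
    """Direct ellipse fit: minimize ||D a||^2 subject to 4ac - b^2 = 1.

    The conic is a*x^2 + b*x*y + c*y^2 + d*x + e*y + f = 0 with
    coefficient vector a = (a, b, c, d, e, f). The constrained problem
    becomes the generalized eigenproblem S a = lam * C a.
    """
    D = np.column_stack([x * x, x * y, y * y, x, y, np.ones_like(x)])
    S = D.T @ D                       # scatter matrix
    C = np.zeros((6, 6))              # constraint matrix: a^T C a = 4ac - b^2
    C[0, 2] = C[2, 0] = 2.0
    C[1, 1] = -1.0
    # Solve via eig(S^{-1} C); keep the candidate with a positive
    # constraint value (an ellipse) and the smallest algebraic cost.
    w, V = np.linalg.eig(np.linalg.solve(S, C))
    best, best_cost = None, np.inf
    for k in range(6):
        a = np.real(V[:, k])
        con = a @ C @ a
        if con > 1e-12:
            a = a / np.sqrt(con)      # rescale so 4ac - b^2 = 1 exactly
            cost = a @ S @ a
            if cost < best_cost:
                best, best_cost = a, cost
    return best
```

Because the constraint forces 4ac - b² > 0, the fitted conic is an ellipse by construction even on noisy or partial data, which is the paper's central claim over general conic fitters.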