Results 1–10 of 352,775
An Analysis Of The Exponentiated Gradient Descent Algorithm
"... This paper analyses three algorithms recently studied in the Computational Learning Theory community: the Gradient Descent (GD) Algorithm, the Exponentiated Gradient Algorithm with Positive and Negative weights (EG algorithm) and the Exponentiated Gradient Algorithm with Unnormalised Positive and Ne ..."
Abstract

Cited by 1 (1 self)
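As a companion to this abstract, the two families of update rules it compares can be sketched as follows. This is a minimal illustration for squared loss with hand-picked step size `eta` and weight budget `U`, not the paper's exact setting:

```python
import numpy as np

def gd_step(w, x, y, eta=0.1):
    """One Gradient Descent (GD) step on the squared loss (w.x - y)^2."""
    grad = 2.0 * (w @ x - y) * x
    return w - eta * grad

def eg_pm_step(wp, wm, x, y, eta=0.1, U=1.0):
    """One Exponentiated Gradient step with positive and negative weights
    (the EG+- variant): the effective weight vector is w = wp - wm, both
    halves are updated multiplicatively, then renormalized to total mass U."""
    grad = 2.0 * ((wp - wm) @ x - y) * x
    wp = wp * np.exp(-eta * grad)
    wm = wm * np.exp(eta * grad)
    Z = (wp.sum() + wm.sum()) / U
    return wp / Z, wm / Z
```

Dropping the final renormalization gives the unnormalised variant the abstract mentions.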
Multiple-gradient descent algorithm (MGDA)
 J.A. Désidéri, Research Report 6953, INRIA (2009). URL: http://hal.inria.fr/inria00389811/fr/
"... Abstract: In a previous report [4], a methodology for the numerical treatment of a two-objective optimization problem, possibly subject to equality constraints, was proposed. The method was devised to be adapted to cases where an initial design-point is known and such that one of the two disciplines ..."
Abstract

Cited by 12 (3 self)
cooperative optimization phase throughout which all the criteria improve, by a so-called Multiple-Gradient Descent Algorithm (MGDA), which generalizes to n disciplines (n ≥ 2) the classical steepest-descent method. This phase is conducted until a design-point on the Pareto set is reached; then
and tested: Regular Batch Gradient Descent Algorithm, Regularized Gradient Descent Algorithm and
, 2008
"... The goal of the spam filtering problem is to identify an email as spam or not spam. One of the classic techniques used in spam filtering is to predict using logistic regression. Words that frequently occur in a spam email are used as the feature set in the regression problem. Thus, in this report, ..."
Abstract
The goal of the spam filtering problem is to identify an email as spam or not spam. One of the classic techniques used in spam filtering is to predict using logistic regression. Words that frequently occur in a spam email are used as the feature set in the regression problem. Thus, in this report, we examine some of the different techniques used for minimizing the logistic loss function and provide a performance analysis of the different techniques. Specifically, three different types of minimization techniques were implemented
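The approach this abstract describes, logistic regression on word-frequency features fitted by batch gradient descent, can be sketched as below. The feature matrix and step size are illustrative assumptions, not taken from the report:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def logistic_loss_grad(w, X, y):
    """Gradient of the average logistic loss; rows of X are word-count
    feature vectors and y in {0, 1} marks spam / not spam."""
    p = sigmoid(X @ w)
    return X.T @ (p - y) / len(y)

def batch_gd(X, y, eta=0.5, steps=500):
    """Regular batch gradient descent on the logistic loss."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        w -= eta * logistic_loss_grad(w, X, y)
    return w
```

Adding a penalty term such as `lam * w` to the gradient would give the regularized variant named in the entry's title.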
Stochastic Gradient Descent Algorithm in the Computational Network Toolkit
"... We introduce the stochastic gradient descent algorithm used in the computational network toolkit (CNTK) — a general purpose machine learning toolkit written in C++ for training and using models that can be expressed as a computational network. We describe the algorithm used to compute the gradients ..."
Abstract
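A generic minibatch stochastic gradient descent loop of the kind the toolkit implements might look like the following; this is an illustrative sketch, not CNTK's actual C++ implementation:

```python
import numpy as np

def sgd(grad_fn, w0, data, eta=0.01, epochs=50, batch=2, seed=0):
    """Minibatch stochastic gradient descent: shuffle the samples each
    epoch and update the parameters on the gradient of each small batch
    in turn, rather than on the full-dataset gradient."""
    rng = np.random.default_rng(seed)
    w = np.array(w0, dtype=float)
    data = list(data)
    for _ in range(epochs):
        rng.shuffle(data)
        for i in range(0, len(data), batch):
            w -= eta * grad_fn(w, data[i:i + batch])
    return w
```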
Two Gradient Descent Algorithms for Blind Signal Separation
 In Proceedings of ICANN'96, Lecture Notes in Computer Science Vol. 1112
, 1996
"... Two algorithms are derived based on the natural gradient of the mutual information of the linearly transformed mixtures. These algorithms can be easily implemented on a neural-network-like system. Two performance functions are introduced based on the two approximation methods for evaluating the mutual ..."
Abstract

Cited by 6 (4 self)
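A natural-gradient separation rule of the general form used in this line of work (the Amari-style update W ← W + η(I − φ(y)yᵀ)W, with y = Wx the current source estimate) can be sketched as follows; the paper's two specific algorithms and performance functions are not reproduced here:

```python
import numpy as np

def natural_gradient_step(W, x, eta=0.01, phi=np.tanh):
    """One natural-gradient blind-separation update: y = W x is the
    current estimate of the separated sources, and W is nudged so that
    the nonlinearly transformed outputs phi(y) decorrelate from y."""
    y = W @ x
    n = len(y)
    return W + eta * (np.eye(n) - np.outer(phi(y), y)) @ W
```

The choice of the nonlinearity `phi` (here `tanh`, an assumption) determines which source distributions the rule can separate.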
MULTIPLE-GRADIENT DESCENT ALGORITHM FOR MULTIOBJECTIVE OPTIMIZATION
, 2012
"... Abstract. The steepest-descent method is a well-known and effective single-objective descent algorithm when the gradient of the objective function is known. Here, we propose a particular generalization of this method to multiobjective optimization by considering the concurrent minimization of n smo ..."
Abstract

Cited by 1 (0 self)
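For the two-objective case (n = 2), the common descent direction MGDA builds, the negative of the minimum-norm point of the segment between the two gradients, has a closed form. The sketch below illustrates that construction only; it is not the paper's general n-objective algorithm:

```python
import numpy as np

def mgda_direction(g1, g2):
    """Minimum-norm point of the segment [g1, g2] (the convex hull of
    the two gradients), negated: a direction along which both objectives
    decrease, which vanishes at Pareto-stationary points."""
    d = g1 - g2
    dd = d @ d
    # Minimize ||a*g1 + (1-a)*g2||^2 over a in [0, 1].
    a = 0.5 if dd == 0.0 else float(np.clip(-(g2 @ d) / dd, 0.0, 1.0))
    w = a * g1 + (1.0 - a) * g2
    return -w
```

When the two gradients coincide this reduces to ordinary steepest descent, matching the abstract's claim that MGDA generalizes that method.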
Parallel tracing of multiple trajectories in gradient descent algorithm with Cell Broadband Engine
"... Explored here is the ability of the Cell B.E. to efficiently reveal viable solutions of nonlinear function approximation with a multilayer perceptron (MLP) employing a gradient descent algorithm. The capacity of the Cell B.E. to asynchronously trace several trajectories of the implemented gradient descent algorithm f ..."
Abstract
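The idea of tracing several gradient-descent trajectories from different starting points and keeping the best endpoint can be sketched as follows; here the trajectories run serially, whereas the paper dispatches them asynchronously to the Cell B.E.'s SPE cores:

```python
import numpy as np

def descend(grad, x0, eta=0.1, steps=200):
    """Plain gradient descent from one starting point."""
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        x -= eta * grad(x)
    return x

def multi_start(f, grad, starts):
    """Trace one trajectory per start and keep the endpoint with the
    lowest objective value, reducing the risk of a single trajectory
    stalling in a poor local minimum."""
    ends = [descend(grad, s) for s in starts]
    return min(ends, key=f)
```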
Application of Genetic and Gradient Descent Algorithms to Wavefront Compensation for Deep-Space
 Jet Propulsion Laboratory
, 2005
"... Present adaptive optics systems use a wavefront sensor to detect phase errors in the incoming wavefront. Knowledge of these phase errors then is used to correct the incoming wavefront, reducing image distortion. However, these systems require that a portion of the incoming light be diverted to the ..."
Abstract

Cited by 3 (2 self)
sensor in order to detect and correct wavefront errors. Two common stochastic optimization techniques, genetic algorithms and gradient descent algorithms, are evaluated in this article. Although these algorithms are promising, further work is necessary to enable them to be used in practical adaptive
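One common sensorless wavefront-optimization scheme in the gradient-descent family is stochastic parallel gradient descent, in which the gradient of a measured image-quality metric is estimated from paired random perturbations of all actuators at once. The sketch below illustrates that scheme; it is an assumption that it matches the article's variant:

```python
import numpy as np

def spgd_step(u, metric, delta=0.05, gain=0.5, rng=None):
    """One stochastic parallel gradient descent (SPGD) step: apply a
    random +/-delta pattern to the actuator commands u, measure the
    resulting change in the image-quality metric, and move along the
    perturbation scaled by that change (this variant maximizes the metric)."""
    rng = rng or np.random.default_rng()
    p = delta * rng.choice([-1.0, 1.0], size=u.shape)
    dJ = metric(u + p) - metric(u - p)
    return u + gain * dJ * p
```

No wavefront sensor appears anywhere in the loop: only the scalar metric is measured, which is the point of this family of methods.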
A Fully Adaptive Normalized Nonlinear Gradient Descent Algorithm
"... A fully adaptive normalized nonlinear gradient descent (FANNGD) algorithm for neural adaptive filters employed for nonlinear system identification is proposed. This full adaptation is achieved using the instantaneous squared prediction error to adapt the free parameter of the NNGD algorithm. The con ..."
Abstract

Cited by 2 (0 self)
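A normalized nonlinear gradient descent step for a single tanh neuron can be sketched as below; the `eps` regularizer is held fixed here, whereas FANNGD's contribution is precisely to adapt that free parameter from the instantaneous squared prediction error:

```python
import numpy as np

def nngd_step(w, x, d, mu=0.5, eps=1e-2):
    """One normalized nonlinear gradient descent step for a single tanh
    neuron y = tanh(w.x): the raw gradient step is divided by the
    regularized squared norm of the instantaneous gradient direction,
    so the effective learning rate tracks the input and slope scale."""
    y = np.tanh(w @ x)
    e = d - y                      # instantaneous prediction error
    g = (1.0 - y * y) * x          # dy/dw = tanh'(net) * x
    return w + mu * e * g / (eps + g @ g)
```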