Results 1 - 10 of 561
Incremental Variant Control
, 1988
"... This paper introduces a model of handling Programming-Variations-in-the-Small (PVITS, see [8]) on the basis of program fragments. An abstract syntax of the language to describe variant configurations is automatically derived from the fragment dependency relation. We generate an interactive user inte ..."
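One loose reading of the model sketched above is that a variant configuration is a set of program fragments that is closed under the fragment dependency relation. The sketch below illustrates only that reading; it is not the paper's model, and the function name valid_configurations, the fragments, and the dependency map are all hypothetical.

```python
from itertools import combinations

def valid_configurations(fragments, depends_on):
    """Enumerate fragment sets closed under the dependency relation (illustrative only)."""
    configs = []
    for r in range(1, len(fragments) + 1):
        for subset in combinations(fragments, r):
            chosen = set(subset)
            # A configuration is kept only if every chosen fragment's dependencies are chosen too.
            if all(set(depends_on.get(f, [])) <= chosen for f in chosen):
                configs.append(sorted(chosen))
    return configs

frags = ["core", "logging", "fast_io", "debug_io"]
deps = {"logging": ["core"], "fast_io": ["core"], "debug_io": ["core", "logging"]}
for config in valid_configurations(frags, deps):
    print(config)
```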
A View Of The Em Algorithm That Justifies Incremental, Sparse, And Other Variants
- Learning in Graphical Models
, 1998
"... . The EM algorithm performs maximum likelihood estimation for data in which some variables are unobserved. We present a function that resembles negative free energy and show that the M step maximizes this function with respect to the model parameters and the E step maximizes it with respect to the d ..."
Abstract
-
Cited by 993 (18 self)
- Add to MetaCart
to the distribution over the unobserved variables. From this perspective, it is easy to justify an incremental variant of the EM algorithm in which the distribution for only one of the unobserved variables is recalculated in each E step. This variant is shown empirically to give faster convergence in a mixture
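The incremental E step described in this abstract recomputes the distribution for a single data point per step while reusing running sufficient statistics for the rest. A minimal sketch of that idea for a two-component 1-D Gaussian mixture follows; it is an illustration of the variant, not the authors' code, and the function name incremental_em is hypothetical.

```python
import numpy as np

def gauss_pdf(x, mu, var):
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)

def incremental_em(x, n_iter=20, seed=0):
    """Incremental EM for a 1-D two-component Gaussian mixture (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    n = len(x)
    pi = np.array([0.5, 0.5])                      # mixing weights
    mu = rng.choice(x, size=2, replace=False)      # crude mean initialisation
    var = np.array([np.var(x), np.var(x)])
    r = np.full((n, 2), 0.5)                       # per-point responsibilities
    S0, S1, S2 = r.sum(axis=0), r.T @ x, r.T @ (x ** 2)   # running sufficient statistics
    for _ in range(n_iter):
        for i in rng.permutation(n):
            # Incremental E step: refresh only point i's responsibilities.
            S0 -= r[i]; S1 -= r[i] * x[i]; S2 -= r[i] * x[i] ** 2
            p = pi * gauss_pdf(x[i], mu, var)
            r[i] = p / p.sum()
            S0 += r[i]; S1 += r[i] * x[i]; S2 += r[i] * x[i] ** 2
            # M step from the running sums (uses every point's statistics).
            pi, mu = S0 / n, S1 / S0
            var = S2 / S0 - mu ** 2 + 1e-6
    return pi, mu, var

data = np.concatenate([np.random.normal(-2, 1, 200), np.random.normal(3, 1, 200)])
print(incremental_em(data))
```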
Learning quickly when irrelevant attributes abound: A new linear-threshold algorithm
- Machine Learning
, 1988
"... learning Boolean functions, linear-threshold algorithms Abstract. Valiant (1984) and others have studied the problem of learning various classes of Boolean functions from examples. Here we discuss incremental learning of these functions. We consider a setting in which the learner responds to each ex ..."
Cited by 773 (5 self)
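The linear-threshold algorithm referred to here uses multiplicative weight updates, which is what keeps the mistake bound small when most attributes are irrelevant. A short sketch of a Winnow-style learner is given below, assuming Boolean attribute vectors and a fixed threshold; the function name winnow_train and the toy data are illustrative, not from the paper.

```python
def winnow_train(examples, n, alpha=2.0):
    """Winnow-style linear-threshold learner with multiplicative updates (sketch)."""
    w = [1.0] * n              # one positive weight per Boolean attribute
    theta = float(n)           # fixed threshold
    for x, y in examples:
        pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) >= theta else 0
        if pred == y:
            continue           # weights change only on mistakes
        if y == 1:             # false negative: promote the active attributes
            w = [wi * alpha if xi else wi for wi, xi in zip(w, x)]
        else:                  # false positive: demote the active attributes
            w = [wi / alpha if xi else wi for wi, xi in zip(w, x)]
    return w, theta

# Target concept x1 OR x3 over six attributes; the rest are irrelevant.
data = [((1, 0, 0, 1, 0, 1), 1), ((0, 1, 0, 0, 1, 0), 0),
        ((0, 0, 1, 1, 1, 0), 1), ((0, 1, 0, 0, 0, 1), 0)] * 10
print(winnow_train(data, n=6))
```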
Analysis of Some Incremental Variants of Policy Iteration: First Steps Toward Understanding Actor-Critic Learning Systems
, 1993
"... This paper studies algorithms based on an incremental dynamic programming abstraction of one of the key issues in understanding the behavior of actor-critic learning systems. The prime example of such a learning system is the ASE/ACE architecture introduced by Barto, Sutton, and Anderson (1983). Als ..."
Cited by 32 (0 self)
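As a rough companion to this abstract, the sketch below shows a tabular actor-critic loop in which a critic maintains state values with TD updates and an actor adjusts action preferences using the TD error, in the spirit of the ASE/ACE architecture. The chain environment, step sizes, and function name actor_critic_chain are all made up for illustration.

```python
import numpy as np

def actor_critic_chain(n_states=5, episodes=500, alpha_v=0.1, alpha_p=0.1, gamma=0.95, seed=0):
    """Tabular actor-critic sketch on a small chain MDP (illustrative only)."""
    rng = np.random.default_rng(seed)
    V = np.zeros(n_states)                 # critic: state values
    prefs = np.zeros((n_states, 2))        # actor: preferences for actions {left, right}
    for _ in range(episodes):
        s = 0
        while s < n_states - 1:            # rightmost state is terminal (reward 1)
            # Softmax policy over the actor's preferences.
            p = np.exp(prefs[s] - prefs[s].max())
            p /= p.sum()
            a = rng.choice(2, p=p)
            s_next = max(s - 1, 0) if a == 0 else s + 1
            r = 1.0 if s_next == n_states - 1 else 0.0
            done = s_next == n_states - 1
            # Critic: TD error and value update.
            td = r + (0.0 if done else gamma * V[s_next]) - V[s]
            V[s] += alpha_v * td
            # Actor: push the taken action's preference in the direction of the TD error.
            prefs[s, a] += alpha_p * td * (1.0 - p[a])
            s = s_next
    return V, prefs

print(actor_critic_chain())
```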
Incremental parsing with the perceptron algorithm
- In ACL
, 2004
"... This paper describes an incremental parsing approach where parameters are estimated using a variant of the perceptron algorithm. A beam-search algorithm is used during both training and decoding phases of the method. The perceptron approach was implemented with the same feature set as that of an exi ..."
Cited by 177 (4 self)
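The general recipe named in this abstract, a structured perceptron trained and decoded with beam search, can be sketched on a toy sequence-labeling task as below. The feature templates, beam size, and data are invented for illustration and do not reproduce the paper's parser or its training details.

```python
from collections import defaultdict

def features(words, tags, i):
    """Toy feature templates for assigning tags[i] to words[i]."""
    prev = tags[i - 1] if i > 0 else "<s>"
    return [("word+tag", words[i], tags[i]), ("prevtag+tag", prev, tags[i])]

def beam_decode(words, labels, w, beam_size=4):
    """Incremental left-to-right decoding that keeps only the best-scoring partial sequences."""
    beam = [(0.0, [])]
    for i in range(len(words)):
        candidates = []
        for score, tags in beam:
            for lab in labels:
                new = tags + [lab]
                s = score + sum(w.get(f, 0.0) for f in features(words, new, i))
                candidates.append((s, new))
        beam = sorted(candidates, key=lambda c: -c[0])[:beam_size]
    return beam[0][1]

def train_perceptron(data, labels, epochs=5, beam_size=4):
    """Structured perceptron: decode with the beam, then add gold and subtract predicted features."""
    w = defaultdict(float)
    for _ in range(epochs):
        for words, gold in data:
            pred = beam_decode(words, labels, w, beam_size)
            if pred != gold:
                for i in range(len(words)):
                    for f in features(words, gold, i):
                        w[f] += 1.0
                    for f in features(words, pred, i):
                        w[f] -= 1.0
    return w

train = [(["the", "dog", "barks"], ["DET", "NOUN", "VERB"]),
         (["a", "cat", "sleeps"], ["DET", "NOUN", "VERB"])]
w = train_perceptron(train, labels=["DET", "NOUN", "VERB"])
print(beam_decode(["the", "cat", "barks"], ["DET", "NOUN", "VERB"], w))
```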
Theory Refinement on Bayesian Networks
, 1991
"... Theory refinement is the task of updating a domain theory in the light of new cases, to be done automatically or with some expert assistance. The problem of theory refinement under uncertainty is reviewed here in the context of Bayesian statistics, a theory of belief revision. The problem is reduced ..."
"... refined from data. Algorithms for refinement of Bayesian networks are presented to illustrate what is meant by "partial theory", "alternative theory representation", etc. The algorithms are an incremental variant of batch learning algorithms from the literature so can work well ..."
Cited by 255 (5 self)
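One small piece of refining a Bayesian network from new cases is updating its conditional probability tables incrementally. The sketch below keeps Dirichlet-style pseudo-counts for a single table and revises them one case at a time; the class name CPTRefiner and the toy cases are hypothetical, and the sketch does not cover the structural revision discussed in the paper.

```python
class CPTRefiner:
    """Incrementally refined conditional probability table P(child | parents) (sketch)."""
    def __init__(self, child_values, prior_count=1.0):
        self.child_values = list(child_values)
        self.prior = prior_count          # Dirichlet-style pseudo-count per cell
        self.counts = {}                  # parent configuration -> {child value: count}

    def _row(self, parents):
        if parents not in self.counts:
            self.counts[parents] = {v: self.prior for v in self.child_values}
        return self.counts[parents]

    def observe(self, parents, child):
        """Incorporate one new case by bumping a single count."""
        self._row(parents)[child] += 1.0

    def prob(self, parents, child):
        row = self._row(parents)
        return row[child] / sum(row.values())

cpt = CPTRefiner(child_values=["cancer", "no_cancer"])
for case in [(("smoker",), "cancer"), (("smoker",), "no_cancer"), (("smoker",), "cancer")]:
    cpt.observe(*case)
print(cpt.prob(("smoker",), "cancer"))   # (1 + 2) / (2 + 3) = 0.6
```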
Incremental Subgradient Methods For Nondifferentiable Optimization
, 2001
"... We consider a class of subgradient methods for minimizing a convex function that consists of the sum of a large number of component functions. This type of minimization arises in a dual context from Lagrangian relaxation of the coupling constraints of large scale separable problems. The idea is to p ..."
"... squares problems, such as those arising in the training of neural networks, and it has resulted in a much better practical rate of convergence than the steepest descent method. In this paper, we establish the convergence properties of a number of variants of incremental subgradient methods, including some ..."
Cited by 124 (10 self)
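An incremental subgradient method steps along the subgradient of one component function at a time rather than the full sum. The sketch below applies that update, with a diminishing step size, to the toy objective f(x) = sum_i |x - a_i| (whose minimizer is the median); the objective and the function name incremental_subgradient are chosen only for illustration.

```python
import numpy as np

def incremental_subgradient(a, n_passes=200, seed=0):
    """Incremental subgradient method for f(x) = sum_i |x - a_i| (sketch)."""
    rng = np.random.default_rng(seed)
    x = 0.0
    k = 0
    for _ in range(n_passes):
        for i in rng.permutation(len(a)):          # one component function per step
            k += 1
            g = np.sign(x - a[i])                  # a subgradient of |x - a_i|
            x -= (1.0 / k) * g                     # diminishing step size
    return x

a = np.array([1.0, 2.0, 3.0, 10.0, 11.0])
print(incremental_subgradient(a), np.median(a))
```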
Algorithms for Deterministic Incremental Dependency Parsing
- Computational Linguistics
, 2008
"... Parsing algorithms that process the input from left to right and construct a single derivation have often been considered inadequate for natural language parsing because of the massive ambiguity typically found in natural language grammars. Nevertheless, it has been shown that such algorithms, combi ..."
Abstract
-
Cited by 116 (20 self)
- Add to MetaCart
, combined with treebank-induced classifiers, can be used to build highly accurate disambiguating parsers, in particular for dependency-based syntactic representations. In this article, we first present a general framework for describing and analyzing algorithms for deterministic incremental dependency
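Deterministic incremental dependency parsing of the kind analyzed here processes words left to right with a stack and a small set of transitions chosen by a classifier. The sketch below implements an arc-standard transition system with a static oracle standing in for the treebank-induced classifier; the example sentence and function names are illustrative, not the article's formulation.

```python
def arc_standard_parse(n_words, predict):
    """Arc-standard transition system for deterministic dependency parsing (sketch).

    Words are numbered 1..n_words, 0 is the artificial root. `predict` maps a
    configuration to "SHIFT", "LEFT", or "RIGHT"; in practice it would be a
    treebank-induced classifier. Returns head[i] for each word i.
    """
    stack, buffer, heads = [0], list(range(1, n_words + 1)), {}
    while buffer or len(stack) > 1:
        action = predict(stack, buffer, heads)
        if action == "SHIFT" and buffer:
            stack.append(buffer.pop(0))
        elif action == "LEFT" and len(stack) >= 2:     # second-from-top gets top as head
            dep = stack.pop(-2)
            heads[dep] = stack[-1]
        elif action == "RIGHT" and len(stack) >= 2:    # top gets second-from-top as head
            dep = stack.pop()
            heads[dep] = stack[-2]
        else:
            raise ValueError("invalid transition proposed")
    return heads

def oracle_for(gold_heads):
    """Static oracle recovering a projective gold tree (stands in for a classifier)."""
    def predict(stack, buffer, heads):
        if len(stack) >= 2:
            top, below = stack[-1], stack[-2]
            if gold_heads.get(below) == top:
                return "LEFT"
            if gold_heads.get(top) == below and all(
                    d in heads for d, h in gold_heads.items() if h == top):
                return "RIGHT"
        return "SHIFT"
    return predict

# "economic news had little effect": gold heads of words 1..5 (0 is the root).
gold = {1: 2, 2: 3, 3: 0, 4: 5, 5: 3}
print(arc_standard_parse(5, oracle_for(gold)))
```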
Incremental concept formation algorithms based on Galois (concept) lattices
, 1995
"... . The Galois (or concept) lattice produced from a binary relation has been proved useful for many applications. Building the Galois lattice can be considered as a conceptual clustering method since it results in a concept hierarchy. This article presents incremental algorithms for updating the Galoi ..."
"... algorithms to three other batch algorithms. Surprisingly, when the total time for incremental generation is used, the simplest and less efficient variant of the incremental algorithms outperforms the batch algorithms in most cases. When only the incremental update time is used, the incremental algorithm ..."
Cited by 132 (9 self)
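A formal concept of a binary relation pairs a set of objects with the set of attributes they all share, and the intents are exactly the intersections of object intents. The brute-force sketch below enumerates all concepts of a tiny context by closing the object intents under intersection; it illustrates what the lattice contains, not the incremental update algorithms the paper compares, and the names used are hypothetical.

```python
def concepts(context):
    """All formal concepts (extent, intent) of a binary relation (naive sketch)."""
    all_attrs = set().union(*context.values()) if context else set()
    intents = {frozenset(all_attrs)}
    changed = True
    while changed:
        changed = False
        # Close the object intents under pairwise intersection until a fixpoint.
        for attrs in list(intents) + [frozenset(a) for a in context.values()]:
            for other in list(intents):
                meet = frozenset(attrs) & other
                if meet not in intents:
                    intents.add(meet)
                    changed = True
    result = []
    for intent in sorted(intents, key=len):
        extent = {o for o, attrs in context.items() if intent <= attrs}
        result.append((frozenset(extent), intent))
    return result

ctx = {"o1": {"a", "b"}, "o2": {"b", "c"}, "o3": {"a", "b", "c"}}
for extent, intent in concepts(ctx):
    print(sorted(extent), sorted(intent))
```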
Incremental Nonparametric Bayesian Regression
"... Abstract. In this paper we develop an incremental estimation algorithm for infinite mixtures of Gaussian process experts. Incremental, local, non-linear regression algorithms are required for a wide variety of applications, ranging from robotic control to neural decoding. Arguably the most popular a ..."
Abstract
-
Cited by 6 (3 self)
- Add to MetaCart
and widely used of such algorithms is currently Locally Weighted Projection Regression (LWPR) which has been shown empirically to be both computationally efficient and sufficiently accurate for a number of applications. While incremental variants of non-linear Bayesian regression models have superior
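In the spirit of the locally weighted regression family mentioned in this abstract, the sketch below maintains a growing set of Gaussian receptive fields, each holding weighted running sums for a local linear fit, and updates them one sample at a time. This is a simplified stand-in, not LWPR itself or the paper's mixture-of-GP-experts model; all class and parameter names are hypothetical.

```python
import numpy as np

class LocalModel:
    """One Gaussian receptive field holding weighted running sums for a local linear fit."""
    def __init__(self, centre, width):
        self.c, self.w = centre, width
        self.sw = 1e-8
        self.sx = self.sy = self.sxx = self.sxy = 0.0
    def activation(self, x):
        return np.exp(-0.5 * ((x - self.c) / self.w) ** 2)
    def update(self, x, y):
        a = self.activation(x)
        self.sw += a; self.sx += a * x; self.sy += a * y
        self.sxx += a * x * x; self.sxy += a * x * y
    def predict(self, x):
        mx, my = self.sx / self.sw, self.sy / self.sw
        var = self.sxx / self.sw - mx * mx
        slope = (self.sxy / self.sw - mx * my) / (var + 1e-8)
        return my + slope * (x - mx)

class IncrementalLWR:
    """Incremental locally weighted regression sketch: add a receptive field when none fires strongly."""
    def __init__(self, width=0.3, act_threshold=0.5):
        self.width, self.thresh, self.models = width, act_threshold, []
    def update(self, x, y):
        if not self.models or max(m.activation(x) for m in self.models) < self.thresh:
            self.models.append(LocalModel(x, self.width))
        for m in self.models:
            m.update(x, y)
    def predict(self, x):
        acts = np.array([m.activation(x) for m in self.models])
        preds = np.array([m.predict(x) for m in self.models])
        return float((acts * preds).sum() / (acts.sum() + 1e-8))

rng = np.random.default_rng(0)
model = IncrementalLWR()
for x in rng.uniform(-3, 3, 500):
    model.update(x, np.sin(x) + 0.1 * rng.normal())
print(model.predict(1.5), np.sin(1.5))
```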