Results 1-10 of 3,722,722
Log-Linear Model Combination with Word-Dependent Scaling Factors
Abstract
"Log-linear model combination is the standard approach in LVCSR to combine several knowledge sources, usually an acoustic and a language model. Instead of using a single scaling factor per knowledge source, we make the scaling factor word- and pronunciation-dependent. In this work, we combine three aco ..."
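As a rough illustration of the idea in this entry (not the paper's actual implementation), word-dependent scaling amounts to looking up per-word factors and applying them to each knowledge source's log-probability. All names and factor values below are hypothetical:

```python
# Sketch of log-linear model combination with word-dependent scaling factors:
# score(w) = lam_ac(w) * log P_acoustic(x|w) + lam_lm(w) * log P_lm(w|h).
# The factor values here are illustrative, not taken from the paper.

WORD_LAMBDAS = {"hello": (1.0, 0.9)}  # word -> (acoustic factor, LM factor)
DEFAULT_LAMBDAS = (1.0, 1.0)          # fallback: standard per-source scaling

def combined_score(word, log_p_acoustic, log_p_lm):
    """Return the combined log-score for one word hypothesis."""
    lam_ac, lam_lm = WORD_LAMBDAS.get(word, DEFAULT_LAMBDAS)
    return lam_ac * log_p_acoustic + lam_lm * log_p_lm
```

During decoding, hypotheses would be ranked by this combined score; with a single global pair of factors it reduces to the usual one-scaling-factor-per-knowledge-source setup.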
Parsing the WSJ using CCG and log-linear models
In Proceedings of the 42nd Meeting of the ACL, 2004
Abstract
Cited by 187 (22 self)
"This paper describes and evaluates log-linear parsing models for Combinatory Categorial Grammar (CCG). A parallel implementation of the L-BFGS optimisation algorithm is described, which runs on a Beowulf cluster, allowing the complete Penn Treebank to be used for estimation. We also develop a new eff ..."
An Efficient Boosting Algorithm for Combining Preferences
, 1999
Abstract
Cited by 707 (18 self)
"The problem of combining preferences arises in several applications, such as combining the results of different search engines. This work describes an efficient algorithm for combining multiple preferences. We first give a formal framework for the problem. We then describe and analyze a new boosting ..."
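The boosted combiner this entry describes outputs a weighted vote over weak rankings, of the form H(x) = Σ_t α_t h_t(x). A minimal sketch of that final-hypothesis form with invented weak rankers and weights; it illustrates only how the combined ranking is read out, not the paper's weak-learner or weight-update steps:

```python
# Order items by a weighted sum of weak-ranker scores, H(x) = sum_t alpha_t * h_t(x).
# The weak rankers and weights below are illustrative only.

def combine_rankings(items, weak_rankers, alphas):
    """Sort items by the weighted sum of weak-ranker scores (higher = better)."""
    def H(x):
        return sum(a * h(x) for h, a in zip(weak_rankers, alphas))
    return sorted(items, key=H, reverse=True)

# Example: two "search engines" scoring documents on different signals.
h1 = {"doc1": 0.9, "doc2": 0.2, "doc3": 0.5}.get
h2 = {"doc1": 0.1, "doc2": 0.8, "doc3": 0.6}.get
ranked = combine_rankings(["doc1", "doc2", "doc3"], [h1, h2], [1.0, 1.0])
```

With equal weights, "doc3" wins here because it scores reasonably under both engines, even though each engine individually prefers a different document.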
Wide-coverage efficient statistical parsing with CCG and log-linear models
COMPUTATIONAL LINGUISTICS, 2007
Abstract
Cited by 219 (43 self)
"This paper describes a number of log-linear parsing models for an automatically extracted lexicalized grammar. The models are "full" parsing models in the sense that probabilities are defined for complete parses, rather than for independent events derived by decomposing the parse tree. Dis ..."
Conditional Log-Linear Structures for Log-Linear
Abstract
"Log-linear modelling will take quite a long time if the data involve many variables and if we try to deal with all the variables at once. Fienberg and Kim (1999) investigated the relationship between a log-linear model and its conditional, and we will show how this relationship is employed to make ..."
Log-Linear Interpolation of Language Models
in Proc. ICSLP'98, 1998
Abstract
Cited by 46 (6 self)
"A new method to combine language models is derived. This method of log-linear interpolation (LLI) is used for adaptation and for combining models of different context length. In both cases LLI is better than linear interpolation."
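Log-linear interpolation combines model probabilities multiplicatively, P(w|h) ∝ ∏_i p_i(w|h)^λ_i, with renormalization over the vocabulary, in contrast to linear interpolation's weighted sum. A minimal sketch over toy unigram distributions; the distributions and weights are invented for illustration:

```python
import math

def log_linear_interpolate(dists, lambdas):
    """Combine distributions as p(w) proportional to prod_i p_i(w)**lam_i."""
    vocab = dists[0].keys()
    unnorm = {w: math.prod(p[w] ** lam for p, lam in zip(dists, lambdas))
              for w in vocab}
    z = sum(unnorm.values())  # normalization constant over the vocabulary
    return {w: v / z for w, v in unnorm.items()}

# Toy example: two unigram models over a two-word vocabulary.
p1 = {"a": 0.5, "b": 0.5}
p2 = {"a": 0.9, "b": 0.1}
p = log_linear_interpolate([p1, p2], [0.5, 0.5])
```

Note the need for the normalization constant z: unlike linear interpolation, the product of powers of probabilities does not sum to one on its own.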
Exploiting Generative Models in Discriminative Classifiers
In Advances in Neural Information Processing Systems 11, 1998
Abstract
Cited by 538 (11 self)
"Generative probability models such as hidden Markov models provide a principled way of treating missing information and dealing with variable-length sequences. On the other hand, discriminative methods such as support vector machines enable us to construct flexible decision boundaries and often result in classification performance superior to that of the model-based approaches. An ideal classifier should combine these two complementary approaches. In this paper, we develop a natural way of achieving this combination by deriving kernel functions for use in discriminative methods such as support ..."
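The kernel derived in this line of work (the Fisher kernel) compares two examples through the gradient of the generative model's log-likelihood. A toy sketch for a one-parameter Bernoulli model, using the identity matrix in place of the inverse Fisher information; the model choice is mine for brevity, not the paper's HMM setting:

```python
def fisher_score(x_bits, theta):
    """d/dtheta of log prod_i Bernoulli(x_i; theta) -- the Fisher score U_x."""
    return sum(xi / theta - (1 - xi) / (1 - theta) for xi in x_bits)

def fisher_kernel(x, y, theta):
    """K(x, y) = U_x * U_y, approximating the inverse Fisher information by I."""
    return fisher_score(x, theta) * fisher_score(y, theta)
```

The resulting kernel can then be plugged into any kernel classifier, e.g. an SVM, which is the combination of generative and discriminative methods the abstract describes.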
Generalized Additive Models
, 1984
Abstract
Cited by 2413 (46 self)
"Likelihood-based regression models, such as the normal linear regression model and the linear logistic model, assume a linear (or some other parametric) form for the covariate effects. We introduce the local scoring procedure, which replaces the linear form ∑ Xjβj by a sum of smooth functions ∑ sj(Xj) ..."
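For the Gaussian case, fitting the additive form ∑ sj(Xj) reduces to backfitting: each sj is refit to the partial residuals of the others until convergence. Below, a hypothetical least-squares-line "smoother" stands in for a real scatterplot smoother, so the sketch illustrates only the backfitting loop, not the paper's smoothing or local scoring steps:

```python
def line_smoother(x, r):
    """Fit r = a + b*x by least squares; a stand-in for a real smoother."""
    n = len(x)
    xm, rm = sum(x) / n, sum(r) / n
    b = (sum((xi - xm) * (ri - rm) for xi, ri in zip(x, r))
         / sum((xi - xm) ** 2 for xi in x))
    a = rm - b * xm
    return [a + b * xi for xi in x]

def backfit(X, y, smoother, n_iter=20):
    """Fit y = alpha + sum_j s_j(x_j) by backfitting on partial residuals."""
    n = len(y)
    alpha = sum(y) / n
    fits = [[0.0] * n for _ in X]  # fitted values of each s_j
    for _ in range(n_iter):
        for j, xj in enumerate(X):
            partial = [y[i] - alpha - sum(fits[k][i]
                       for k in range(len(X)) if k != j) for i in range(n)]
            sj = smoother(xj, partial)
            m = sum(sj) / n          # center each s_j for identifiability
            fits[j] = [v - m for v in sj]
    return alpha, fits
```

With a single covariate and this linear stand-in smoother, backfitting simply recovers ordinary least squares; the additive model's flexibility comes from substituting a genuine smoother.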
Learning probabilistic relational models
In IJCAI, 1999
Abstract
Cited by 619 (31 self)
"A large portion of real-world data is stored in commercial relational database systems. In contrast, most statistical learning methods work only with "flat" data representations. Thus, to apply these methods, we are forced to convert our data into a flat form, thereby losing much of the relational structure present in our database. This paper builds on the recent work on probabilistic relational models (PRMs), and describes how to learn them from databases. PRMs allow the properties of an object to depend probabilistically both on other properties of that object and on properties of related ..."
Lattice-Based Access Control Models
, 1993
Abstract
Cited by 1485 (56 self)
"The objective of this article is to give a tutorial on lattice-based access control models for computer security. The paper begins with a review of Denning's axioms for information flow policies, which provide a theoretical foundation for these models. The structure of security labels in the ..."
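In such lattice models, a security label is commonly a (classification level, category set) pair, and information may flow from object b to object a only when a's label dominates b's. A minimal dominance check under that common formulation; the level values and category names are invented:

```python
def dominates(a, b):
    """Label a dominates b iff a.level >= b.level and b's categories are a subset of a's."""
    level_a, cats_a = a
    level_b, cats_b = b
    return level_a >= level_b and cats_b <= cats_a  # <= is subset test on sets

# Illustrative labels: (numeric clearance level, set of compartments).
secret_nuclear = (2, {"NUCLEAR"})
unclassified = (1, set())
```

The dominance relation is a partial order: two labels with incomparable category sets dominate each other in neither direction, which is exactly what makes the label structure a lattice rather than a simple linear hierarchy.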