Results 1–10 of 4,043
Semantics of Context-Free Languages
 In Mathematical Systems Theory
, 1968
"... "Meaning " may be assigned to a string in a contextfree language by defining "attributes " of the symbols in a derivation tree for that string. The attributes can be defined by functions associated with each production in the grammar. This paper examines the implications of th ..."
Abstract

Cited by 569 (0 self)
"Meaning " may be assigned to a string in a contextfree language by defining "attributes " of the symbols in a derivation tree for that string. The attributes can be defined by functions associated with each production in the grammar. This paper examines the implications
An Efficient Context-Free Parsing Algorithm
, 1970
"... A parsing algorithm which seems to be the most efficient general contextfree algorithm known is described. It is similar to both Knuth's LR(k) algorithm and the familiar topdown algorithm. It has a time bound proportional to n 3 (where n is the length of the string being parsed) in general; i ..."
Abstract

Cited by 798 (0 self)
; it has an n² bound for unambiguous grammars; and it runs in linear time on a large class of grammars, which seems to include most practical context-free programming language grammars. In an empirical comparison it appears to be superior to the top-down and bottom-up algorithms studied by Griffiths
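The chart-based scheme this entry describes (Earley's algorithm) can be sketched compactly. This is a minimal recognizer, not the paper's own presentation: states are (head, body, dot, origin) items, and the three classic operations are predict, scan, and complete; worst-case time is O(n³).

```python
# Minimal Earley recognizer sketch (toy grammar below is invented).
def earley_recognize(grammar, start, tokens):
    """grammar: dict nonterminal -> list of bodies (tuples of symbols)."""
    n = len(tokens)
    chart = [set() for _ in range(n + 1)]
    for body in grammar[start]:
        chart[0].add((start, body, 0, 0))
    for i in range(n + 1):
        agenda = list(chart[i])
        while agenda:
            head, body, dot, origin = agenda.pop()
            if dot < len(body):
                sym = body[dot]
                if sym in grammar:                      # predict
                    for b in grammar[sym]:
                        s = (sym, b, 0, i)
                        if s not in chart[i]:
                            chart[i].add(s); agenda.append(s)
                elif i < n and tokens[i] == sym:        # scan
                    chart[i + 1].add((head, body, dot + 1, origin))
            else:                                       # complete
                for h, b, d, o in list(chart[origin]):
                    if d < len(b) and b[d] == head:
                        s = (h, b, d + 1, o)
                        if s not in chart[i]:
                            chart[i].add(s); agenda.append(s)
    return any(h == start and d == len(b) and o == 0
               for h, b, d, o in chart[n])

# An ambiguous toy grammar: E -> E + E | a
G = {"E": [("E", "+", "E"), ("a",)]}
print(earley_recognize(G, "E", list("a+a+a")))  # True
print(earley_recognize(G, "E", list("a+")))     # False
```

Note the ambiguity of `a+a+a` costs nothing extra here; the set-based chart merges duplicate items, which is the source of the n² bound for unambiguous grammars.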
Statistical Parsing with a Context-Free Grammar and Word Statistics
, 1997
"... We describe a parsing system based upon a language model for English that is, in turn, based upon assigning probabilities to possible parses for a sentence. This model is used in a parsing system by finding the parse for the sentence with the highest probability. This system outperforms previou ..."
Abstract

Cited by 414 (18 self)
explain their relative performance. Introduction We present a statistical parser that induces its grammar and probabilities from a hand-parsed corpus (a treebank). Parsers induced from corpora are of interest both simply as exercises in machine learning and also because they are often the best parsers
Hierarchical phrase-based translation
 Computational Linguistics
, 2007
"... We present a statistical machine translation model that uses hierarchical phrases—phrases that contain subphrases. The model is formally a synchronous contextfree grammar but is learned from a parallel text without any syntactic annotations. Thus it can be seen as combining fundamental ideas from b ..."
Abstract

Cited by 597 (9 self)
We present a statistical machine translation model that uses hierarchical phrases—phrases that contain subphrases. The model is formally a synchronous context-free grammar but is learned from a parallel text without any syntactic annotations. Thus it can be seen as combining fundamental ideas from
A hierarchical phrase-based model for statistical machine translation
 IN ACL
, 2005
"... We present a statistical phrasebased translation model that uses hierarchical phrases— phrases that contain subphrases. The model is formally a synchronous contextfree grammar but is learned from a bitext without any syntactic information. Thus it can be seen as a shift to the formal machinery of ..."
Abstract

Cited by 491 (12 self)
We present a statistical phrase-based translation model that uses hierarchical phrases—phrases that contain subphrases. The model is formally a synchronous context-free grammar but is learned from a bitext without any syntactic information. Thus it can be seen as a shift to the formal machinery
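The key mechanism in these two hierarchical models—a synchronous context-free grammar whose rules pair source and target sides and may reorder subphrases—can be sketched with an invented rule (the rule and helper below are illustrative, not from the papers):

```python
# Toy synchronous-CFG rule application. Integers in a right-hand side
# index subderivations shared between the two sides; reordering them on
# the target side models translation divergence such as X1-of-X2 vs X2-de-X1.

def apply_sync(rule, subs):
    """rule: (source_rhs, target_rhs); subs: index -> realized subphrase."""
    src_rhs, tgt_rhs = rule
    realize = lambda rhs: " ".join(
        subs[s] if isinstance(s, int) else s for s in rhs)
    return realize(src_rhs), realize(tgt_rhs)

# Hypothetical rule X -> <X1 of X2, X2 de X1>:
rule = ((0, "of", 1), (1, "de", 0))
src, tgt = apply_sync(rule, {0: "bank", 1: "China"})
print(src)  # bank of China
print(tgt)  # China de bank
```

Because both sides share one derivation, parsing the source with the source sides of the rules simultaneously builds the target translation.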
Learning Stochastic Logic Programs
, 2000
"... Stochastic Logic Programs (SLPs) have been shown to be a generalisation of Hidden Markov Models (HMMs), stochastic contextfree grammars, and directed Bayes' nets. A stochastic logic program consists of a set of labelled clauses p:C where p is in the interval [0,1] and C is a firstorder r ..."
Abstract

Cited by 1194 (81 self)
Stochastic Logic Programs (SLPs) have been shown to be a generalisation of Hidden Markov Models (HMMs), stochastic context-free grammars, and directed Bayes' nets. A stochastic logic program consists of a set of labelled clauses p:C where p is in the interval [0,1] and C is a first
Converting Context-Free Grammars To Constraint Dependency Grammars
, 1995
"... : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : viii 1. INTRODUCTION : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : : 1 2. CONSTRAINT DEPENDENCY PARSING : : : : : : : : : : : : : : : : : : : : : 3 2.1 Definition Of Maruyama's Constraint Depen ..."
Abstract
from GCFG 2.3.3 Comparing Maruyama's techniques to those used in this thesis 2.4 Summary 3. CONVERTING A CONTEXT-FREE GRAMMAR TO A CONSTRAINT DEPENDENCY GRAMMAR
Pfold: RNA secondary structure prediction using stochastic context-free grammars
 Nucleic Acids Res
, 2003
"... RNA secondary structures are important in many biological processes and efficient structure prediction can give vital directions for experimental investigations. Many available programs for RNA secondary structure prediction only use a single sequence at a time. This may be sufficient in some applic ..."
Abstract

Cited by 213 (11 self)
RNA secondary structures are important in many biological processes and efficient structure prediction can give vital directions for experimental investigations. Many available programs for RNA secondary structure prediction only use a single sequence at a time. This may be sufficient in some applications, but often it is possible to obtain related RNA sequences with conserved secondary structure. These should be included in structural analyses to give improved results. This work presents a practical way of predicting RNA secondary structure that is especially useful when related sequences can be obtained. The method improves a previous algorithm based on an explicit evolutionary model and a probabilistic model of structures. Predictions can be done on a web server at
Treebank Grammars
 In Proc. of the 13th National Conference on Artificial Intelligence (AAAI1996
, 1996
"... By a “treebank grammar ” we mean a contextfree grammar created by reading the production rules directly from handparsed sentences in a tree bank. Common wisdom has it that such grammars do not perform we & though we know of no published data on the issue. The primary purpose of this paper is ..."
Abstract

Cited by 252 (4 self)
By a “treebank grammar” we mean a context-free grammar created by reading the production rules directly from hand-parsed sentences in a tree bank. Common wisdom has it that such grammars do not perform well, though we know of no published data on the issue. The primary purpose of this paper
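Reading production rules directly off hand-parsed trees, as the entry above describes, is a short exercise. The sketch below uses a made-up nested-tuple tree encoding (not the Penn Treebank format): each node is (label, child, ...), and each node contributes one production whose right-hand side is its children's labels.

```python
# Extract context-free productions (with counts) from a hand-parsed tree.
from collections import Counter

def read_productions(tree, counts):
    label, *children = tree
    # RHS: child labels for subtrees, the word itself for lexical leaves.
    rhs = tuple(c if isinstance(c, str) else c[0] for c in children)
    counts[(label, rhs)] += 1
    for c in children:
        if not isinstance(c, str):
            read_productions(c, counts)

# One toy hand-parsed sentence:
tree = ("S", ("NP", ("DT", "the"), ("NN", "cat")),
             ("VP", ("VB", "sleeps")))
counts = Counter()
read_productions(tree, counts)
for (lhs, rhs), n in sorted(counts.items()):
    print(lhs, "->", " ".join(rhs), f"[count={n}]")
```

Normalizing the counts per left-hand side would turn this into the probabilistic treebank grammar that the statistical parsing entries above train from corpora.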
Stochastic Context-Free Grammars for tRNA Modeling
, 1994
"... Stochastic contextfree grammars (SCFGs) are applied to the problems of folding, aligning and modeling families of tRNA sequences. SCFGs capture the sequences ' common primary and secondary structure and generalize the hidden Markov models (HMMs) used in related work on protein and DNA. Results ..."
Abstract

Cited by 156 (9 self)
Stochastic context-free grammars (SCFGs) are applied to the problems of folding, aligning and modeling families of tRNA sequences. SCFGs capture the sequences' common primary and secondary structure and generalize the hidden Markov models (HMMs) used in related work on protein and DNA
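The workhorse computation behind SCFG modeling of sequence families is the inside algorithm, which generalizes the HMM forward algorithm. Below is a minimal sketch for a grammar in Chomsky normal form, with an invented toy grammar (the rule probabilities are illustrative, not a tRNA model):

```python
# Inside algorithm for a CNF stochastic CFG:
# inside[i][j][A] = probability that nonterminal A derives tokens[i:j].

def inside_prob(unary, binary, start, tokens):
    """unary: dict (A, terminal) -> prob; binary: dict (A, B, C) -> prob."""
    n = len(tokens)
    inside = [[{} for _ in range(n + 1)] for _ in range(n + 1)]
    for i, t in enumerate(tokens):                      # width-1 spans
        for (A, term), p in unary.items():
            if term == t:
                inside[i][i + 1][A] = inside[i][i + 1].get(A, 0.0) + p
    for span in range(2, n + 1):                        # wider spans, CKY-style
        for i in range(n - span + 1):
            j = i + span
            cell = inside[i][j]
            for k in range(i + 1, j):                   # split point
                for (A, B, C), p in binary.items():
                    pb = inside[i][k].get(B, 0.0)
                    pc = inside[k][j].get(C, 0.0)
                    if pb and pc:
                        cell[A] = cell.get(A, 0.0) + p * pb * pc
    return inside[0][n].get(start, 0.0)

# Toy grammar: S -> S S (0.3) | 'a' (0.7)
unary = {("S", "a"): 0.7}
binary = {("S", "S", "S"): 0.3}
print(inside_prob(unary, binary, "S", list("aa")))  # ≈ 0.3 * 0.7 * 0.7 = 0.147
```

Because the inner sum ranges over split points and the cells over span endpoints, the run time is O(n³), matching the general context-free parsing bound above.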