Turning on the Turbo: Fast Third-Order Non-Projective Turbo Parsers
Citations
Cites methods from "Turning on the Turbo: Fast Third-Or..."
...The following external resources were used: part-of-speech tags and extra syntactic dependency information obtained with TurboTagger and TurboParser (Martins et al., 2013), trained on the Penn Treebank (for English) and on the version of the German TIGER corpus used in the SPMRL shared task (Seddah et al., 2014) for German....
[...]
...The syntactic dependencies are predicted with TurboParser trained on the TIGER German treebank....
[...]
...TurboParser (Martins et al., 2013) uses AD3 (Martins et al., 2011), a type of augmented Lagrangian relaxation, to integrate third-order features into a CLE backbone....
[...]
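The snippet above notes that AD3 is a type of augmented Lagrangian relaxation for coupling subproblems. As a toy illustration of the underlying mechanism (not AD3 itself, and not the authors' implementation), the sketch below uses plain subgradient dual decomposition to force two independent arc scorers to agree on a head choice per word; all names are mine, and the per-word argmax ignores the tree constraint that a real parser (e.g. a CLE backbone) would enforce:

```python
import numpy as np

def dual_decomp_agree(scores_a, scores_b, steps=50, lr=0.5):
    """Subgradient dual decomposition forcing two arc models to agree.

    scores_a[h, m] and scores_b[h, m] score head h for word m. Each model
    independently picks its best head per word; Lagrange multipliers u push
    the two argmax solutions toward agreement. (AD3 replaces this subgradient
    loop with an augmented-Lagrangian update, but the coupling idea is the same.)
    """
    u = np.zeros_like(scores_a)
    za = np.argmax(scores_a, axis=0)
    for _ in range(steps):
        za = np.argmax(scores_a + u, axis=0)  # model A's head choice per word
        zb = np.argmax(scores_b - u, axis=0)  # model B's head choice per word
        if np.array_equal(za, zb):
            return za  # subproblems agree: a certificate of optimality
        for m in range(scores_a.shape[1]):
            if za[m] != zb[m]:
                # penalize A's choice, reward B's choice, for word m
                u[za[m], m] -= lr
                u[zb[m], m] += lr
    return za  # may not have converged within the step budget

a = np.array([[2.0, 0.0],
              [0.0, 1.0]])  # rows: candidate heads, columns: words
b = np.array([[1.0, 2.0],
              [0.5, 0.5]])
heads = dual_decomp_agree(a, b)
print(heads)  # -> [0 0]: after one multiplier update, both models pick head 0
```

In this example the models initially disagree on the second word's head; one multiplier step is enough to tip both argmaxes to the jointly best choice.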
...For parsing, we start with TurboParser, which is open-source and has been found to perform well on a range of parsing problems in different languages (Martins et al., 2013; Kong and Smith, 2014)....
[...]
References
"Turning on the Turbo: Fast Third-Or..." refers methods in this paper
...To this end, we converted the Penn Treebank to dependencies through (i) the head rules of Yamada and Matsumoto (2003) (PTB-YM) and (ii) basic dependencies from the Stanford parser 2.0.5 (PTB-S). We trained by running 10 epochs of cost-augmented MIRA (Crammer et al., 2006)....
[...]
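Cost-augmented MIRA, mentioned in the snippet above, makes the smallest weight update (capped at an aggressiveness constant C) that lets the gold structure outscore the cost-augmented prediction by a margin equal to its cost. The sketch below shows a single PA-I-style update step with illustrative feature vectors; it is a generic MIRA update, not the authors' exact training code:

```python
import numpy as np

def mira_update(w, feats_gold, feats_pred, cost, C=1.0):
    """One cost-augmented MIRA (PA-I style) update.

    Moves w the minimum distance needed so that the gold structure outscores
    the predicted structure by a margin of `cost`, with step size capped at C.
    """
    delta = feats_gold - feats_pred
    margin_violation = cost - w.dot(delta)  # hinge loss under current w
    if margin_violation <= 0:
        return w  # margin already satisfied; no update needed
    sq_norm = delta.dot(delta)
    if sq_norm == 0:
        return w  # identical features; nothing to update
    tau = min(C, margin_violation / sq_norm)
    return w + tau * delta

# Toy example: gold and predicted trees fire disjoint indicator features.
w = np.zeros(3)
w = mira_update(w, np.array([1.0, 0.0, 1.0]), np.array([0.0, 1.0, 0.0]), cost=1.0)
# w is now roughly [1/3, -1/3, 1/3]; the margin constraint holds with equality.
```

Running several epochs of such updates over the training set (with cost-augmented inference producing `feats_pred`) is the training regime the quoted paper describes.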
...3), we used 14 datasets, most of which are non-projective, from the CoNLL 2006 and 2008 shared tasks (Buchholz and Marsi, 2006; Surdeanu et al., 2008)....
[...]
"Turning on the Turbo: Fast Third-Or..." refers background or methods in this paper
...We use an arc-factored score function (McDonald et al., 2005): f_TREE(z) = Σ_{m=1}^{L} σ_ARC(π(m), m), where π(m) is the parent of the m-th word according to the parse tree z, and σ_ARC(h, m) is the score of an individual arc....
[...]
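The arc-factored score quoted above decomposes over individual head-modifier arcs, so scoring a tree is just a sum of per-arc scores. A minimal sketch (function names and the toy scores are my own, purely illustrative):

```python
def tree_score(heads, arc_scores):
    """Arc-factored score: f_TREE(z) = sum over m of sigma_ARC(pi(m), m).

    heads[m] is the head pi(m) of word m (words are 1-indexed; 0 is the root;
    heads[0] is an unused placeholder). arc_scores[h][m] is sigma_ARC(h, m),
    the score of the arc from head h to modifier m.
    """
    return sum(arc_scores[h][m] for m, h in enumerate(heads) if m > 0)

# Toy 3-word sentence: words 1 and 3 attach to the root (0), word 2 to word 1.
scores = {
    0: {1: 2.0, 2: 0.5, 3: 1.0},
    1: {2: 1.5, 3: 0.2},
    3: {2: 0.1},
}
heads = [0, 0, 1, 0]  # pi(1)=0, pi(2)=1, pi(3)=0
print(tree_score(heads, scores))  # 2.0 + 1.5 + 1.0 = 4.5
```

Because the score factors this way, first-order decoding reduces to finding the highest-scoring directed spanning tree over the arc scores; higher-order models add terms that couple several arcs.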
...First-order models factor over arcs (Eisner, 1996; McDonald et al., 2005), and second-order models also include consecutive siblings and grandparents (Carreras, 2007)....
[...]