Recurrent models and lower bounds for projective syntactic decoding - ACL Anthology

Natalie Schluter. Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers). 2019.

3 mentions: @seb_ruder, @glorisonne, @_dmh
Keywords: acl
Date: 2019/10/11 12:50

Referring Tweets

@seb_ruder In conclusion, even neural networks simulating Eisner's algorithm still require cubic time. For more on this, check out her @NAACLHLT paper: t.co/0cxazjnhTJ
@glorisonne @natschluter giving us some theory to digest in her last talk before lunch: walking us through her proof that projective max spanning tree decoding cannot be solved significantly faster than in cubic time (O(n^3)); more details in #NAACL19 article t.co/pNBizKfQDu #EurNLP t.co/NbErOO6Kxm
@_dmh @natschluter Spoiler: you can do the simulation. This talk was extremely cool and Schluter's #NAACL2019 paper is now on my reading list: t.co/UMOoKLQmAf
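The tweets above reference Eisner's algorithm, the standard O(n^3) dynamic program for maximum-score projective dependency decoding that the paper's lower bound targets. As background, here is a minimal sketch of the score-computation half of Eisner's algorithm (no backpointers, scoring only); the function name and input convention are illustrative, not from the paper.

```python
def eisner_max_score(arc_score):
    """Score of the best projective dependency tree, via Eisner's
    O(n^3) dynamic program.

    arc_score[h][m] is the score of an arc from head h to modifier m;
    position 0 is the artificial root. Returns the best tree score.
    """
    n = len(arc_score)
    NEG = float("-inf")
    # complete[s][t][d] / incomplete[s][t][d]: best score of a span
    # (s, t) headed at the right end (d=0) or the left end (d=1).
    complete = [[[NEG, NEG] for _ in range(n)] for _ in range(n)]
    incomplete = [[[NEG, NEG] for _ in range(n)] for _ in range(n)]
    for i in range(n):
        complete[i][i][0] = complete[i][i][1] = 0.0

    for span in range(1, n):          # span length
        for s in range(n - span):     # span start
            t = s + span              # span end
            # Incomplete spans: combine two complete halves and add
            # the arc between the endpoints (either direction).
            best = max(complete[s][r][1] + complete[r + 1][t][0]
                       for r in range(s, t))
            incomplete[s][t][0] = best + arc_score[t][s]  # arc t -> s
            incomplete[s][t][1] = best + arc_score[s][t]  # arc s -> t
            # Complete spans: absorb a finished sub-span on one side.
            complete[s][t][0] = max(complete[s][r][0] + incomplete[r][t][0]
                                    for r in range(s, t))
            complete[s][t][1] = max(incomplete[s][r][1] + complete[r][t][1]
                                    for r in range(s + 1, t + 1))

    # Best full tree: complete span over the sentence, headed at root 0.
    return complete[0][n - 1][1]
```

The three nested loops (span length, span start, split point) make the cubic cost explicit; the paper's contribution is a matching conditional lower bound, i.e. that recurrent neural decoders cannot beat this asymptotically.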

Related Entries

Read more ACL 2017 Participation Report
0 users, 0 mentions 2018/07/08 06:23
Read more EACL2017 - Accepted Papers
0 users, 0 mentions 2018/04/22 03:40
Read more How to Navigate ACL 2018
0 users, 0 mentions 2018/07/08 06:23
Read more GitHub - jiesutd/NCRFpp: NCRF++, an Open-source Neural Sequence Labeling Toolkit. It includes charac...
0 users, 0 mentions 2018/06/16 13:30
Read more NAACL ’19 Notes: Practical Insights for Natural Language Processing Applications — Part I
0 users, 11 mentions 2019/07/24 21:27