[2004.12681v1] Lexically Constrained Neural Machine Translation with Levenshtein Transformer

This paper proposes a simple and effective algorithm for incorporating lexical constraints in neural machine translation. Previous work either required re-training existing models with the lexical constraints or incorporated them during beam search decoding at significantly higher computational overhead. Leveraging the flexibility and speed of the recently proposed Levenshtein Transformer model (Gu et al., 2019), our method injects terminology constraints at inference time without any impact on decoding speed. Our method requires no modification to the training procedure and can be easily applied at runtime with custom dictionaries. Experiments on English-German WMT datasets show that our approach improves over an unconstrained baseline and over previous approaches.
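The core idea described above is that an edit-based decoder like the Levenshtein Transformer refines a sequence through alternating insertion and deletion passes, so constraints can be enforced by seeding the initial hypothesis with the constraint tokens and forbidding the deletion pass from ever removing them. The following is a minimal toy sketch of that mechanism, not the authors' implementation: the function names, the float "deletion scores", and the insertion map are all illustrative stand-ins for the model's real classifiers.

```python
# Toy sketch of constraint-preserving edit-based decoding (illustrative only,
# NOT the paper's actual code). The initial hypothesis is seeded with the
# constraint tokens, each flagged as protected; the deletion step may drop
# unprotected tokens but never protected ones, so iterative insert/delete
# refinement can only grow translation text around the constraints.

def init_with_constraints(constraints):
    """Seed the partial hypothesis with constraint tokens and mark each
    one as protected so the deletion policy can never remove it."""
    tokens, protected = [], []
    for phrase in constraints:
        for tok in phrase.split():
            tokens.append(tok)
            protected.append(True)
    return tokens, protected

def apply_deletions(tokens, protected, delete_scores, threshold=0.5):
    """Keep a token if it is protected or its deletion score is below the
    threshold. In the real model the scores come from LevT's deletion
    classifier; here they are hand-picked floats for illustration."""
    kept = [(t, p) for t, p, s in zip(tokens, protected, delete_scores)
            if p or s < threshold]
    return [t for t, _ in kept], [p for _, p in kept]

def apply_insertions(tokens, protected, insertions):
    """insertions maps a slot index to new tokens placed before that
    position (index len(tokens) appends at the end). Inserted tokens
    are unprotected and may be deleted in later refinement rounds."""
    out_toks, out_prot = [], []
    for i, (t, p) in enumerate(zip(tokens, protected)):
        for new in insertions.get(i, []):
            out_toks.append(new)
            out_prot.append(False)
        out_toks.append(t)
        out_prot.append(p)
    for new in insertions.get(len(tokens), []):
        out_toks.append(new)
        out_prot.append(False)
    return out_toks, out_prot

if __name__ == "__main__":
    toks, prot = init_with_constraints(["Levenshtein Transformer"])
    # one refinement round with made-up classifier outputs:
    toks, prot = apply_insertions(toks, prot, {0: ["the"], 2: ["model"]})
    toks, prot = apply_deletions(toks, prot, [0.9, 0.1, 0.2, 0.9])
    print(" ".join(toks))  # constraint tokens survive high deletion scores
```

Even when the toy deletion scores mark every token for removal, the protected constraint tokens remain, which is the property that lets user dictionaries be enforced at inference time without retraining.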

1 mention: @yoh_okuno
Date: 2020/06/25 03:51

Referring Tweets

@yoh_okuno A method for adding word constraints with LevT. Useful for user dictionaries and for preserving numbers. t.co/78ZzGk4bxj
