[2002.03079] Blank Language Models

We propose Blank Language Model (BLM), a model that generates sequences by dynamically creating and filling in blanks. Unlike previous masked language models or the Insertion Transformer, BLM uses blanks to control which part of the sequence to expand. This fine-grained control of generation is ideal for a variety of text editing and rewriting tasks. The model can start from a single blank or partially completed text with blanks at specified locations. It iteratively determines which word to place in a blank and whether to insert new blanks, and stops generating when no blanks are left to fill. BLM can be efficiently trained using a lower bound of the marginal data likelihood, and achieves perplexity comparable to traditional left-to-right language models on the Penn Treebank and WikiText datasets. On the task of filling missing text snippets, BLM significantly outperforms all other baselines in terms of both accuracy and fluency. Experiments on style transfer and damaged ancient text restoration demonstrate the potential of this framework for a wide range of applications.
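The generation loop described in the abstract (pick a blank, fill it with a word, optionally spawn new blanks to its left and/or right, stop when no blanks remain) can be sketched as follows. This is a minimal illustration of the control flow only; the actual model parameterizes these choices with a Transformer, and `toy_policy`, `BLANK`, and all word lists here are hypothetical stand-ins, not part of the paper.

```python
import random

BLANK = "__"  # placeholder token standing in for the model's blank symbol

def blm_generate(fill_policy, canvas=None, max_steps=50):
    """BLM-style generation: iteratively fill blanks until none remain.

    Each step the policy picks a word for a chosen blank and decides
    whether to insert a new blank to its left and/or right.
    """
    canvas = list(canvas) if canvas else [BLANK]  # start from a single blank
    for _ in range(max_steps):
        blanks = [i for i, t in enumerate(canvas) if t == BLANK]
        if not blanks:
            break  # nothing left to fill: generation is complete
        i = random.choice(blanks)  # which blank to expand next
        word, left, right = fill_policy(canvas, i)
        replacement = ([BLANK] if left else []) + [word] + ([BLANK] if right else [])
        canvas[i:i + 1] = replacement
    return canvas

def toy_policy(canvas, i):
    """Hypothetical stand-in for the trained model: emits words from a
    fixed vocabulary and stops spawning blanks after a few insertions."""
    vocab = ["blank", "language", "models", "fill", "in", "text"]
    n_words = sum(1 for t in canvas if t != BLANK)
    keep_going = n_words < 5  # cap length so generation terminates
    return (random.choice(vocab),
            keep_going and random.random() < 0.5,
            keep_going and random.random() < 0.5)

random.seed(0)
print(" ".join(blm_generate(toy_policy)))
```

Starting from partially completed text (e.g. `["the", BLANK, "models"]`) instead of a single blank gives the text-infilling behavior the abstract describes: only the blanked regions are expanded, and the given tokens are left untouched.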

3 mentions: @skitaoka_
Date: 2020/02/13 14:21

Related Entries

Read more GitHub - neulab/lrlm: Code for the paper "Latent Relation Language Models" at AAAI-20.
0 users, 2 mentions 2020/02/10 12:54
Read more Modeling Children's Language Acquisition and NN Language Models
0 users, 0 mentions 2018/10/05 03:23
Read more Generalized Language Models
1 users, 24 mentions 2019/02/03 02:18
Read more GitHub - facebookresearch/XLM: PyTorch original implementation of Cross-lingual Language Model Pretr...
0 users, 4 mentions 2019/09/03 17:17