[1910.14296] LIMIT-BERT : Linguistic Informed Multi-Task BERT

In this paper, we present Linguistic Informed Multi-Task BERT (LIMIT-BERT), which learns language representations across multiple linguistic tasks via Multi-Task Learning (MTL). LIMIT-BERT covers five key syntactic and semantic tasks: Part-Of-Speech (POS) tagging, constituent and dependency syntactic parsing, and span and dependency semantic role labeling (SRL). In addition, LIMIT-BERT adopts a linguistic masking strategy, Syntactic and Semantic Phrase Masking, which masks all of the tokens corresponding to a syntactic or semantic phrase. Unlike the recent Multi-Task Deep Neural Network (MT-DNN) (Liu et al., 2019), LIMIT-BERT is linguistically motivated and trained in a semi-supervised fashion, which provides linguistic-task data on the same scale as the BERT pre-training corpus. As a result, LIMIT-BERT not only improves performance on linguistic tasks but also benefits from a regularization effect and linguistic information that lead to more general representations, helping it adapt to new tasks and domains.
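The phrase-masking idea can be illustrated with a minimal sketch. This is not the authors' implementation; it assumes phrases are already available as `(start, end)` token spans (e.g. from a syntactic or semantic parse) and shows only the core step: once a phrase is selected, every token inside it is replaced with `[MASK]`, rather than masking tokens independently as in standard BERT.

```python
import random

MASK = "[MASK]"

def phrase_mask(tokens, phrase_spans, mask_ratio=0.15, seed=0):
    """Mask whole phrases until roughly mask_ratio of tokens are covered.

    tokens:       list of token strings
    phrase_spans: list of (start, end) spans, end exclusive, from a parse
    Returns (masked_tokens, labels) where labels holds the original token
    at each masked position and None elsewhere.
    """
    rng = random.Random(seed)
    budget = max(1, int(len(tokens) * mask_ratio))
    spans = list(phrase_spans)
    rng.shuffle(spans)

    masked = list(tokens)
    labels = [None] * len(tokens)
    covered = 0
    for start, end in spans:
        if covered >= budget:
            break
        # Key difference from token-level masking: the whole phrase goes.
        for i in range(start, end):
            if labels[i] is None:
                labels[i] = tokens[i]
                masked[i] = MASK
                covered += 1
    return masked, labels
```

For example, given the sentence "the quick brown fox jumps" with the noun phrase span `(0, 4)`, the function masks "the quick brown fox" as a unit, forcing the model to predict the full phrase from context.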

Keywords: bert
Date: 2019/11/07 15:50
