[1911.02365] Learning to Answer by Learning to Ask: Getting the Best of GPT-2 and BERT Worlds

Automatic question generation aims to generate questions from a context, with the corresponding answers being sub-spans of the given passage. While most existing methods rely on heuristic rules to generate questions, neural network approaches have recently been proposed as well. In this work, we propose a variant of the self-attention Transformer architecture to generate meaningful and diverse questions. To this end, we propose an easy-to-use model that combines the Transformer decoder GPT-2 with the Transformer encoder BERT for the downstream task of question answering. The model is trained in an end-to-end fashion, where the language model learns to produce a question-answer-aware input representation that facilitates the generation of answer-focused questions. Our results on neural question generation from text on the SQuAD 1.1 dataset suggest that our method can produce semantically correct and diverse questions.
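Below is a minimal sketch of how such a BERT-encoder/GPT-2-decoder pairing could be wired up, assuming the HuggingFace transformers library. The linear bridge between hidden sizes and the prefix-style fusion of encoder states into the decoder are illustrative assumptions, not the paper's exact architecture.

```python
# Sketch (not the authors' code): BERT encodes the answer-aware passage,
# a learned linear "bridge" maps its hidden states into GPT-2's embedding
# space, and GPT-2 decodes the question conditioned on that prefix.
import torch
import torch.nn as nn
from transformers import BertModel, GPT2LMHeadModel

class BertToGPT2QuestionGenerator(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = BertModel.from_pretrained("bert-base-uncased")
        self.decoder = GPT2LMHeadModel.from_pretrained("gpt2")
        # Hypothetical bridge: project BERT's hidden size to GPT-2's.
        self.bridge = nn.Linear(self.encoder.config.hidden_size,
                                self.decoder.config.n_embd)

    def forward(self, passage_ids, passage_mask, question_ids):
        # Answer-aware passage representation from the BERT encoder.
        enc = self.encoder(input_ids=passage_ids,
                           attention_mask=passage_mask).last_hidden_state
        prefix = self.bridge(enc)                          # (B, Lp, n_embd)
        # Question token embeddings from GPT-2's input embedding table.
        q_emb = self.decoder.transformer.wte(question_ids)  # (B, Lq, n_embd)
        inputs_embeds = torch.cat([prefix, q_emb], dim=1)
        # Mask the prefix positions with -100 so the language-modeling loss
        # is computed only on the question tokens.
        labels = torch.cat(
            [torch.full(passage_ids.shape, -100, dtype=torch.long,
                        device=passage_ids.device),
             question_ids], dim=1)
        return self.decoder(inputs_embeds=inputs_embeds, labels=labels)
```

In this sketch the whole stack is differentiable, so both pretrained models and the bridge can be fine-tuned jointly on SQuAD-style (passage, answer, question) triples, matching the end-to-end training the abstract describes.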

4 mentions: @mfts0 @pm_girl @helioRocha_
Keywords: bert
Date: 2019/11/07 11:20

Related Entries

[1908.11782] Latent Part-of-Speech Sequences for Neural Machine Translation
0 users, 4 mentions 2019/09/03 14:17
[1909.05527] Inspecting adversarial examples using the Fisher information
0 users, 3 mentions 2019/09/13 14:18
[1909.07483] The Animal-AI Environment: Training and Testing Animal-Like Artificial Cognition
0 users, 4 mentions 2019/09/20 17:18
[1910.03016] Is a Good Representation Sufficient for Sample Efficient Reinforcement Learning?
0 users, 4 mentions 2019/10/11 21:48
[1910.07475] MLQA: Evaluating Cross-lingual Extractive Question Answering
0 users, 3 mentions 2019/10/17 02:19