[1905.13438] Content Word-based Sentence Decoding and Evaluating for Open-domain Neural Response Generation

Various encoder-decoder models have been applied to response generation in open-domain dialogs, but most conventional models directly learn a mapping from lexical input to lexical output without explicitly modeling intermediate representations. Utilizing language hierarchy and modeling intermediate information have been shown to benefit many language understanding and generation tasks. Motivated by Broca's aphasia, we propose to use a content word sequence as an intermediate representation for open-domain response generation. Experimental results show that the proposed method improves the content relatedness of produced responses, and that our models can often choose correct grammatical forms for the generated content words. In addition, instead of evaluating complete sentences, we propose computing conventional metrics on content word sequences, which better indicates content relevance.
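The evaluation idea above can be sketched minimally: strip a response down to its (approximate) content words and score word overlap on that sequence instead of the full sentence. The stopword list and the clipped unigram-precision metric below are illustrative assumptions, not the paper's exact extraction method or metric suite.

```python
from collections import Counter

# Assumption: a small stopword filter approximates content-word extraction;
# the paper could equally use a POS-based filter.
STOPWORDS = {"a", "an", "the", "i", "you", "it", "is", "are", "was",
             "do", "does", "to", "of", "in", "on", "and", "that", "this"}

def content_words(sentence):
    """Keep only (approximate) content words from a whitespace-tokenized sentence."""
    return [w for w in sentence.lower().split() if w not in STOPWORDS]

def unigram_precision(candidate, reference):
    """Clipped unigram precision (the core of BLEU-1) over token lists."""
    if not candidate:
        return 0.0
    cand, ref = Counter(candidate), Counter(reference)
    overlap = sum(min(c, ref[w]) for w, c in cand.items())
    return overlap / len(candidate)

# Compare scoring on full sentences vs. content-word sequences:
gen = "i think it is a great movie"
ref = "that is a fantastic movie"
full_score = unigram_precision(gen.split(), ref.split())
content_score = unigram_precision(content_words(gen), content_words(ref))
```

Here the full-sentence score is inflated by function-word matches ("is", "a"), while the content-word score reflects only the overlap in "movie" against "think"/"great" vs. "fantastic", illustrating why scoring content words alone tracks content relevance more directly.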

Date: 2019/06/05 09:47
