[1906.03731] Is Attention Interpretable?

Attention mechanisms have recently boosted performance on a range of NLP tasks. Because attention layers explicitly weight input components' representations, it is also often assumed that attention can be used to identify information that models found important (e.g., specific contextualized word tokens). We test whether that assumption holds by manipulating attention weights in already-trained text classification models and analyzing the resulting differences in their predictions. While we observe some ways in which higher attention weights correlate with greater impact on model predictions, we also find many ways in which this does not hold, i.e., where gradient-based rankings of attention weights better predict their effects than their magnitudes. We conclude that while attention noisily predicts input components' overall importance to a model, it is by no means a fail-safe indicator.
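A minimal sketch of the kind of probe the abstract describes, on a toy NumPy attention classifier rather than the paper's trained models (the dimensions, random weights, and the finite-difference gradient below are illustrative assumptions, not the authors' code): zero out one token's attention weight, renormalize the rest, and measure how much the originally predicted class's probability drops, comparing the token ranked highest by attention magnitude with the token ranked highest by a gradient-based score.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy single-layer attention classifier (illustrative stand-in for the paper's models):
# token representations H -> attention over tokens -> context vector -> class probabilities.
n_tokens, d, n_classes = 6, 8, 3
H = rng.normal(size=(n_tokens, d))   # contextualized token representations
v = rng.normal(size=d)               # attention scoring vector
W = rng.normal(size=(d, n_classes))  # output layer

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def predict(alpha):
    """Class probabilities given an attention distribution alpha over tokens."""
    context = alpha @ H
    return softmax(context @ W)

alpha = softmax(H @ v)               # original attention weights
p_orig = predict(alpha)
label = int(p_orig.argmax())         # originally predicted class

def zero_and_renormalize(a, i):
    """Zero out attention weight i and renormalize the remaining weights."""
    a = a.copy()
    a[i] = 0.0
    return a / a.sum()

def prediction_drop(i):
    """Drop in the predicted class's probability after removing token i's attention."""
    return p_orig[label] - predict(zero_and_renormalize(alpha, i))[label]

# Ranking 1: by attention magnitude.
top_by_attention = int(alpha.argmax())

# Ranking 2: gradient-based -- sensitivity of the predicted class's probability to each
# attention weight, approximated here with finite differences for simplicity.
eps = 1e-5
grads = np.zeros(n_tokens)
for i in range(n_tokens):
    a_plus = alpha.copy()
    a_plus[i] += eps
    grads[i] = (predict(a_plus)[label] - p_orig[label]) / eps
top_by_gradient = int(np.abs(grads).argmax())

print("highest-attention token:", top_by_attention,
      "-> prediction drop", prediction_drop(top_by_attention))
print("highest-gradient token: ", top_by_gradient,
      "-> prediction drop", prediction_drop(top_by_gradient))
```

In the paper this comparison is run on already-trained text classification models; everything above is random and purely illustrative of the single-weight erasure probe.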

2 mentions: @nlpnoah, @soldni
Keywords: attention
Date: 2019/06/11 14:17

Referring Tweets

@nlpnoah new work by @uwnlp PhD student Sofia Serrano (remember that name!) and @nlpnoah on whether attention does what you think it does, to appear at ACL https://t.co/YZkv4SVXMN
@soldni “Is Attention Interpretable?” by Sofia Serrano and @nlpnoah https://t.co/68AoBHwERb — neat work that analyses how attention weights impact model predictions. Their findings suggest that they kinda do, but much less than one would think.

Related Entries

Read more [MIRU2018] Attention Branch Network Using the Characteristics of Global Average Pooling
2 users, 0 mentions 2018/08/10 09:23
Read more GitHub - harvardnlp/seq2seq-attn: Sequence-to-sequence model with LSTM encoder/decoders and attention
0 users, 0 mentions 2018/04/22 03:40
Read more [1906.01861] GRAM: Scalable Generative Models for Graphs with Graph Attention Mechanism
0 users, 1 mentions 2019/07/16 15:46
Read more [1808.08946] Why Self-Attention? A Targeted Evaluation of Neural Machine Translation Architectures
0 users, 0 mentions 2018/09/07 09:23
Read more [1706.03762] Attention Is All You Need
0 users, 0 mentions 2018/04/22 03:41