[1909.03402] Squeeze-and-Attention Networks for Semantic Segmentation

The squeeze-and-excitation (SE) module enhances the representational power of convolution layers by adaptively re-calibrating channel-wise feature responses. However, SE's attention characterization is limited by its loss of spatial information cues, making it less well suited to perception tasks with strong spatial inter-dependencies, such as semantic segmentation. In this paper, we propose a novel squeeze-and-attention network (SANet) architecture that leverages a simple but effective squeeze-and-attention (SA) module to account for two distinctive characteristics of segmentation: i) pixel-group attention and ii) pixel-wise prediction. Specifically, the proposed SA modules impose pixel-group attention on conventional convolution by introducing an 'attention' convolutional channel, thus taking spatial-channel inter-dependencies into account in an efficient manner. The final segmentation results are produced by merging the outputs from four hierarchical stages of a SANet.
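The channel-wise re-calibration that the abstract attributes to SE, and the spatial attention map that SA adds on top of it, can be sketched roughly as follows. This is a minimal NumPy illustration, not the paper's implementation: the weight matrices `w1`/`w2`, the reduction ratio, and the `sa_like_attention` helper are assumptions made for the example.

```python
import numpy as np


def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))


def se_recalibrate(x, w1, w2):
    """Squeeze-and-excitation on a feature map x of shape (C, H, W).

    Squeeze: global average pooling collapses each channel to a scalar.
    Excitation: FC -> ReLU -> FC -> sigmoid yields one gate per channel,
    which rescales the whole channel uniformly (no spatial information).
    """
    z = x.mean(axis=(1, 2))                      # squeeze -> (C,)
    s = sigmoid(w2 @ np.maximum(w1 @ z, 0.0))    # excitation gates -> (C,)
    return x * s[:, None, None]                  # channel-wise rescaling


def sa_like_attention(x, attn_logits):
    """Illustrative pixel-group attention (simplified, not the paper's SA module).

    Instead of a single scalar per channel, a full spatial map gates each
    location, and the attended features are added back to the trunk, so
    spatial cues are preserved.
    """
    a = sigmoid(attn_logits)                     # (C, H, W) spatial gates
    return a * x + x
```

The contrast the abstract draws is visible in the shapes: SE reduces each channel to one gating scalar, whereas the SA-style branch keeps an `(C, H, W)` map, which is why it can express the spatial inter-dependencies that segmentation needs.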

2 mentions: @ZFPhalanx
Keywords: attention
Date: 2019/09/11 03:48

Related Entries

Read more [MIRU2018] Attention Branch Network using the properties of Global Average Pooling
2 users, 0 mentions 2018/08/10 09:23
Read more GitHub - harvardnlp/seq2seq-attn: Sequence-to-sequence model with LSTM encoder/decoders and attentio...
0 users, 0 mentions 2018/04/22 03:40
Read more [1710.10903] Graph Attention Networks
1 users, 1 mentions 2019/10/15 03:48
Read more [1906.01861] GRAM: Scalable Generative Models for Graphs with Graph Attention Mechanism
0 users, 1 mentions 2019/07/16 15:46
Read more [1808.08946] Why Self-Attention? A Targeted Evaluation of Neural Machine Translation Architectures
0 users, 0 mentions 2018/09/07 09:23