GitHub - BangguWu/ECANet: Code for ECA-Net: Efficient Channel Attention for Deep Convolutional Neural Networks



1 mention: @dakuton
Keywords: attention
Date: 2019/11/21 15:49

Referring Tweets

@dakuton On combining the SE block with a Global Average Pooling-based channel attention block. The kernel size applied after GAP is adjusted according to the channel size. ECA-Net: Efficient Channel Attention t.co/kDMwKfkSfJ
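The adaptive kernel size mentioned in the tweet comes from the ECA-Net paper, which picks the 1D-convolution kernel size k from the channel count C as the nearest odd number to log2(C)/γ + b/γ (with γ=2, b=1 in the paper). A minimal sketch of that mapping, assuming those default hyperparameters:

```python
import math

def eca_kernel_size(channels: int, gamma: int = 2, b: int = 1) -> int:
    """Adaptive 1D-conv kernel size from ECA-Net: k = |log2(C)/gamma + b/gamma|_odd.

    |t|_odd denotes the nearest odd number to t; larger channel
    dimensions get larger kernels for cross-channel interaction.
    """
    t = int(abs(math.log2(channels) / gamma + b / gamma))
    return t if t % 2 else t + 1  # round up to the nearest odd number

# Example: kernel sizes grow slowly with channel width.
for c in (64, 256, 512):
    print(c, eca_kernel_size(c))
```

The full ECA block then applies GAP over each channel, runs this k-sized 1D convolution across the pooled channel vector, and uses a sigmoid of the result as per-channel attention weights, avoiding the SE block's dimensionality-reducing fully connected layers.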

Related Entries

[MIRU2018] Attention Branch Network using the properties of Global Average Pooling (2018/08/10)
GitHub - harvardnlp/seq2seq-attn: Sequence-to-sequence model with LSTM encoder/decoders and attention (2018/04/22)
[1710.10903] Graph Attention Networks (2019/10/15)
[1906.01861] GRAM: Scalable Generative Models for Graphs with Graph Attention Mechanism (2019/07/16)
[1808.08946] Why Self-Attention? A Targeted Evaluation of Neural Machine Translation Architectures (2018/09/07)