deep learning
paper review: “CBAM: Convolutional Block Attention Module”
arxiv: https://arxiv.org/pdf/1807.06521.pdf
key point
the idea is to attach an attention module to a CNN. instead of naively computing a full 3D attention map, which is computationally expensive, this work factorizes attention into two sequential steps — channel attention and spatial attention — which achieves a similar effect with far fewer parameters. channel attention squeezes the spatial dims (avg-pool and max-pool) and passes the result through a shared MLP; spatial attention pools along the channel dim and applies a conv over the two resulting maps.
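a minimal PyTorch sketch of the two modules as i understand them from the paper (reduction ratio 16 and 7×7 spatial conv are the paper's defaults; exact layer choices here are my reading, not the official code):

```python
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    def __init__(self, channels, reduction=16):
        super().__init__()
        # shared MLP applied to both the avg-pooled and max-pooled descriptors
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(),
            nn.Linear(channels // reduction, channels),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        avg = self.mlp(x.mean(dim=(2, 3)))   # global average pool -> MLP
        mx = self.mlp(x.amax(dim=(2, 3)))    # global max pool -> MLP
        return torch.sigmoid(avg + mx).view(b, c, 1, 1)

class SpatialAttention(nn.Module):
    def __init__(self, kernel_size=7):
        super().__init__()
        # conv over the 2-channel map of channel-wise avg and max
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x):
        avg = x.mean(dim=1, keepdim=True)
        mx = x.amax(dim=1, keepdim=True)
        return torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))

class CBAM(nn.Module):
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.ca = ChannelAttention(channels, reduction)
        self.sa = SpatialAttention()

    def forward(self, x):
        x = x * self.ca(x)  # refine channels first
        x = x * self.sa(x)  # then refine spatial locations
        return x
```

the output keeps the input shape, so the block can be dropped into any conv block (e.g. after each residual block in a ResNet).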
(more…)