Publications Repository - Helmholtz-Zentrum Dresden-Rossendorf
NAM: Normalization-based Attention Module
Liu, Y.; Shao, Z.; Teng, Y.; Hoffmann, N.
Abstract
Recognizing less salient features is key to model compression, yet this has not been investigated in the prevalent attention mechanisms. In this work, we propose a novel normalization-based attention module (NAM), which suppresses less salient weights. It applies a weight sparsity penalty to the attention modules, making them more computationally efficient while retaining comparable performance. A comparison with three other attention mechanisms on both ResNet and MobileNet indicates that our method achieves higher accuracy. Code for this paper is publicly available at https://github.com/Christian-lyc/NAM.
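The abstract's core idea, re-using normalization scale factors as a measure of channel saliency, can be illustrated with a minimal NumPy sketch. This is a hypothetical illustration, not the paper's reference implementation: the function name `nam_channel_attention` and the exact gating form are assumptions based on the description of weighting channels by their batch-norm scaling factors and suppressing the less salient ones.

```python
import numpy as np

def nam_channel_attention(x, gamma, beta, eps=1e-5):
    """Sketch of NAM-style channel attention (hypothetical NumPy port).

    x:     feature map, shape (N, C, H, W)
    gamma: per-channel batch-norm scale factors, shape (C,)
    beta:  per-channel batch-norm shift factors, shape (C,)
    """
    # Batch-normalize x per channel over the (N, H, W) axes.
    mean = x.mean(axis=(0, 2, 3), keepdims=True)
    var = x.var(axis=(0, 2, 3), keepdims=True)
    x_bn = (gamma.reshape(1, -1, 1, 1) * (x - mean) / np.sqrt(var + eps)
            + beta.reshape(1, -1, 1, 1))

    # Normalize the scale factors so each channel's weight reflects its
    # relative saliency; channels with small gamma are suppressed.
    w = gamma / gamma.sum()

    # Sigmoid gate over the weighted, normalized features.
    gate = 1.0 / (1.0 + np.exp(-w.reshape(1, -1, 1, 1) * x_bn))
    return x * gate  # re-weight the input features
```

In a real network the `gamma`/`beta` values would come from trained BatchNorm layers, and the sparsity penalty mentioned in the abstract would be added to the loss during training.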
Keywords: Attention; Normalization; ResNet; MobileNet; Deep Learning; ImageNet; CIFAR-100
Contribution to proceedings
35th Conference on Neural Information Processing Systems (NeurIPS 2021), 14.12.2021, Sydney, Australia
Permalink: https://www.hzdr.de/publications/Publ-33599