CBAM: Convolutional Block Attention Module




Source: Semantic Scholar

Reads: 33261

Authors: S. Woo, J. Park, J.-Y. Lee, I.-S. Kweon


Abstract:

We propose Convolutional Block Attention Module (CBAM), a simple yet effective attention module for feed-forward convolutional neural networks. Given an intermediate feature map, our module sequentially infers attention maps along two separate dimensions, channel and spatial, then the attention maps are multiplied to the input feature map for adaptive feature refinement. Because CBAM is a lightweight and general module, it can be integrated into any CNN architectures seamlessly with negligible overheads and is end-to-end trainable along with base CNNs. We validate our CBAM through extensive experiments on ImageNet-1K, MS COCO detection, and VOC 2007 detection datasets. Our experiments show consistent improvements in classification and detection performances with various models, demonstrating the wide applicability of CBAM. The code and models will be publicly available.
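The two-step refinement the abstract describes (a channel attention map, then a spatial attention map, each multiplied into the features) can be sketched as follows. This is an illustrative NumPy sketch, not the authors' implementation: the weight matrices `w1`/`w2` stand in for the shared MLP of the channel branch, and the learned 7×7 convolution of the spatial branch is replaced here by a plain box filter for brevity.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def channel_attention(x, w1, w2):
    # x: (C, H, W). A shared two-layer MLP is applied to both the
    # average-pooled and max-pooled channel descriptors, then summed.
    avg = x.mean(axis=(1, 2))                      # (C,)
    mx = x.max(axis=(1, 2))                        # (C,)
    mlp = lambda v: w2 @ np.maximum(w1 @ v, 0.0)   # ReLU hidden layer
    return sigmoid(mlp(avg) + mlp(mx))             # (C,) weights in (0, 1)

def spatial_attention(x, k=7):
    # x: (C, H, W). Pool along the channel axis (avg and max), then apply
    # a k*k filter; a box filter stands in for the learned convolution.
    feat = (x.mean(axis=0) + x.max(axis=0)) / 2.0  # (H, W)
    p = k // 2
    padded = np.pad(feat, p, mode="edge")
    win = np.lib.stride_tricks.sliding_window_view(padded, (k, k))
    return sigmoid(win.mean(axis=(2, 3)))          # (H, W) weights in (0, 1)

def cbam(x, w1, w2, k=7):
    # Sequential refinement: channel attention first, then spatial.
    x = x * channel_attention(x, w1, w2)[:, None, None]
    x = x * spatial_attention(x, k)[None, :, :]
    return x
```

Because both attention maps pass through a sigmoid, refinement only rescales the input features, so the output keeps the input's shape and is attenuated elementwise.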


Keywords:

Computer Science - Computer Vision and Pattern Recognition

DOI: 10.1007/978-3-030-01234-2_1

Citations: 38

Year: 2018


