Class Activation Map Guided Backpropagation for Discriminative Explanations

Bibliographic Details
Main Authors: Yongjie Liu, Wei Guo, Xudong Lu, Lanju Kong, Zhongmin Yan
Format: Article
Language: English
Published: MDPI AG 2025-01-01
Series: Applied Sciences
Online Access: https://www.mdpi.com/2076-3417/15/1/379
Description
Summary: The interpretability of neural networks has garnered significant attention. In the domain of computer vision, gradient-based feature attribution techniques such as RectGrad use saliency maps to show how individual features contribute to predictions. Despite these advances, RectGrad falls short in category discrimination, producing similar saliency maps across categories. This paper pinpoints the ineffectiveness of RectGrad's threshold-based strategy for distinguishing feature gradients and introduces Class activation map Guided BackPropagation (CGBP) to tackle the issue. CGBP leverages class activation maps during backpropagation to enhance gradient selection, achieving consistent improvements across four models (VGG16, VGG19, ResNet50, and ResNet101) on ImageNet's validation set. Notably, on VGG16, CGBP improves SIC, AIC, and IS scores by 10.3%, 11.5%, and 4.5%, respectively, compared to RectGrad, while maintaining competitive DS performance. Moreover, CGBP demonstrates greater sensitivity to model parameter changes than RectGrad, as confirmed by a sanity check. The proposed method has broad applicability in scenarios such as model debugging, where it identifies causes of misclassification, and medical image diagnosis, where it enhances user trust by aligning visual explanations with clinical insights.
ISSN: 2076-3417
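
To illustrate the core idea described in the abstract — gating backpropagated gradients by class activation map (CAM) importance rather than by a fixed threshold on the gradients themselves, as in RectGrad — here is a minimal numpy sketch. It assumes the standard CAM definition (a class-weighted sum of final convolutional feature maps); the function names, the `keep_ratio` parameter, and the quantile-based gating are illustrative assumptions, not the paper's actual CGBP implementation.

```python
import numpy as np

def class_activation_map(features, fc_weights, class_idx):
    """Standard CAM: weight the final conv feature maps by the
    classifier weights of the target class, then rectify and
    normalize to [0, 1].

    features:   (C, H, W) activations of the last conv layer
    fc_weights: (num_classes, C) classifier weight matrix
    """
    cam = np.tensordot(fc_weights[class_idx], features, axes=1)  # (H, W)
    cam = np.maximum(cam, 0.0)          # keep positive class evidence
    if cam.max() > 0:
        cam = cam / cam.max()           # scale to [0, 1]
    return cam

def cam_guided_gradient(grad, cam, keep_ratio=0.5):
    """Gate a backpropagated gradient map by CAM importance:
    only spatial locations whose CAM score falls in the top
    `keep_ratio` fraction let their gradients pass; the rest
    are zeroed. (Illustrative stand-in for CGBP's gradient
    selection during backpropagation.)
    """
    tau = np.quantile(cam, 1.0 - keep_ratio)
    mask = (cam >= tau).astype(grad.dtype)
    return grad * mask
```

In a full pipeline this gating would be applied layer by layer during the backward pass (e.g. via framework backward hooks), so that the final saliency map is concentrated on regions the target class actually activates — which is what makes the resulting explanations class-discriminative.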