Pruning
Computer science
Cluster analysis
Ablation
Artificial intelligence
Visualization
Set (abstract data type)
Tree (set theory)
Pattern recognition (psychology)
Class (philosophy)
Deep learning
Construct (python library)
Convolutional neural network
Machine learning
Mathematics
Mathematical analysis
Engineering
Agronomy
Biology
Programming language
Aerospace engineering
Authors
A. A. Salama, Noha Adly, Marwan Torki
Identifier
DOI:10.1109/icip46576.2022.9897617
Abstract
Recently, providing explainable deep learning models has attracted considerable attention. In this paper, we take a further step in this direction. We introduce a time-efficient method, called Ablation-CAM++, which can generate smooth visual explanations of CNN model predictions. Our approach uses ablation analysis to determine the importance of activation maps with respect to the target class, similar to Ablation-CAM. However, instead of focusing on the individual importance of each activation map, we group activation maps using a clustering technique. Then, we construct a binary tree for each group by recursively splitting these groups, studying the ablation of each subgroup, and applying tree pruning. We perform qualitative and quantitative evaluations of our visual explanations against Ablation-CAM and Grad-CAM. Our approach can provide visual explanations in less than half the time required by Ablation-CAM. Using the average drop and average increase evaluation metrics on 2000 images of the ImageNet validation set, we compare the effect of applying different clustering techniques in our method.
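The following is a minimal, illustrative sketch of the grouped, recursive ablation idea described in the abstract: activation maps are clustered, each cluster is split recursively into a binary tree, each subgroup is scored by ablation (zeroing its maps and measuring the drop in the target-class score), weak subtrees are pruned, and the surviving weights produce a class activation map. The toy classifier head, the midpoint splitting rule, the pruning threshold, and all function names are assumptions for illustration only, not the authors' released implementation.

```python
# Sketch of a grouped recursive ablation-CAM, in the spirit of Ablation-CAM++.
# All names, the toy head, and the split/prune heuristics are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F
from sklearn.cluster import KMeans


def class_score(head, activations, target):
    """Target-class score computed from a stack of activation maps (K, H, W)."""
    pooled = activations.mean(dim=(1, 2))          # global average pool -> (K,)
    logits = head(pooled.unsqueeze(0))             # (1, num_classes)
    return logits[0, target].item()


def ablation_weight(head, activations, target, idx, base_score):
    """Relative score drop when the maps in `idx` are zeroed out."""
    ablated = activations.clone()
    ablated[idx] = 0.0
    drop = base_score - class_score(head, ablated, target)
    return drop / (abs(base_score) + 1e-8)


def recursive_split(head, acts, target, idx, base_score, weights,
                    min_size=2, prune_thresh=0.0):
    """Binary-split a group of maps, pruning subtrees with negligible importance."""
    w = ablation_weight(head, acts, target, idx, base_score)
    if w <= prune_thresh:                          # prune: subtree contributes nothing
        return
    if len(idx) <= min_size:                       # leaf: assign group weight to its maps
        for i in idx:
            weights[i] = w
        return
    mid = len(idx) // 2                            # simple midpoint split (assumption)
    recursive_split(head, acts, target, idx[:mid], base_score, weights,
                    min_size, prune_thresh)
    recursive_split(head, acts, target, idx[mid:], base_score, weights,
                    min_size, prune_thresh)


def grouped_ablation_cam(head, activations, target, n_clusters=8):
    """Cluster activation maps, then weight them via recursive group ablation."""
    acts = activations.detach()
    K = acts.shape[0]
    base = class_score(head, acts, target)
    labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(
        acts.reshape(K, -1).numpy())
    weights = torch.zeros(K)
    for c in range(n_clusters):
        idx = [i for i in range(K) if labels[i] == c]
        if idx:
            recursive_split(head, acts, target, idx, base, weights)
    cam = F.relu((weights.view(K, 1, 1) * acts).sum(dim=0))
    return cam / (cam.max() + 1e-8)                # normalised saliency map


if __name__ == "__main__":
    torch.manual_seed(0)
    head = nn.Linear(16, 10)                       # toy classifier head over pooled maps
    acts = torch.rand(16, 7, 7)                    # 16 activation maps of size 7x7
    print(grouped_ablation_cam(head, acts, target=3).shape)   # torch.Size([7, 7])
```

Because whole groups are ablated at once and pruned subtrees are never evaluated further, the number of forward passes can be far smaller than the one-pass-per-map cost of plain Ablation-CAM, which is the source of the speed-up claimed in the abstract.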