Divergence (linguistics)
Reliability
Dempster-Shafer theory
Computer science
Information fusion
Artificial intelligence
Kullback-Leibler divergence
Sensor fusion
Data mining
Epistemology
Philosophy
Linguistics
Authors
Yin Zhu, Xiaojian Ma, Hang Wang
Identifier
DOI: 10.1587/transinf.2023edp7102
Abstract
Highly conflicting evidence that may lead to counter-intuitive fusion results is one of the main challenges for information fusion in Dempster-Shafer evidence theory. To address this issue, evidence conflict is investigated through belief divergence, which measures the discrepancy between bodies of evidence. In this paper, a pignistic probability transform belief χ2 divergence, termed the BBχ2 divergence, is proposed. By introducing the pignistic probability transform, the proposed BBχ2 divergence can accurately quantify the difference between bodies of evidence while taking multi-element focal sets into account. Compared with several existing belief divergences, the proposed divergence measures evidence discrepancy more precisely. Building on this divergence, a new multi-source information fusion method is devised: it combines credibility weights with information volume weights to determine the overall weight of each body of evidence. Finally, the proposed method is applied to target recognition and fault diagnosis, where comparative analysis shows that it achieves the highest accuracy in managing evidence conflict.
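To illustrate the two ingredients named in the abstract, the sketch below computes the pignistic probability transform of a basic belief assignment (BBA) and a symmetric χ2-style divergence between the resulting distributions. This is a minimal illustration of the general approach only: the exact BBχ2 formula in the paper is not reproduced here, and the frame, the example BBAs, and the symmetrized χ2 form used are assumptions for demonstration.

```python
def pignistic(bba, frame):
    """Pignistic probability transform: BetP(x) = sum over focal sets A
    containing x of m(A) / |A|.  `bba` maps frozenset focal elements to
    masses summing to 1, with no mass on the empty set."""
    betp = {x: 0.0 for x in frame}
    for focal, mass in bba.items():
        for x in focal:
            betp[x] += mass / len(focal)
    return betp

def chi2_divergence(p, q):
    """Symmetric chi-square-style divergence between two probability
    distributions over the same frame (illustrative form; the paper's
    BBχ2 definition may differ in detail)."""
    return sum((p[x] - q[x]) ** 2 / (p[x] + q[x])
               for x in p if p[x] + q[x] > 0)

# Hypothetical example: two bodies of evidence over the frame {a, b, c},
# including multi-element focal sets.
frame = {"a", "b", "c"}
m1 = {frozenset({"a"}): 0.6, frozenset({"a", "b"}): 0.3, frozenset(frame): 0.1}
m2 = {frozenset({"b"}): 0.5, frozenset({"a", "b"}): 0.4, frozenset(frame): 0.1}

d = chi2_divergence(pignistic(m1, frame), pignistic(m2, frame))
```

In a fusion pipeline of the kind the abstract describes, pairwise divergences like `d` would feed a credibility weight for each body of evidence before the weighted evidence is combined.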