Computer science
Uncertainty quantification
Bootstrapping (statistics)
Object detection
Inference
Measurement uncertainty
Block (permutation group theory)
Gaussian process
Minimum bounding box
Object (grammar)
Process (computing)
Code (set theory)
Data mining
Detector
Gaussian distribution
Artificial intelligence
Algorithm
Machine learning
Image (mathematics)
Pattern recognition (psychology)
Mathematics
Statistics
Physics
Geometry
Set (abstract data type)
Quantum mechanics
Programming language
Operating system
Telecommunications
Authors
Sanbao Su,Yiming Li,Sihong He,Songyang Han,Chen Feng,Caiwen Ding,Fei Miao
Identifier
DOI:10.1109/icra48891.2023.10160367
Abstract
Sharing information between connected and autonomous vehicles (CAVs) fundamentally improves the performance of collaborative object detection for self-driving. However, CAVs still have uncertainties in object detection due to practical challenges, which affect downstream modules in self-driving such as planning and control. Hence, uncertainty quantification is crucial for safety-critical systems such as CAVs. Our work is the first to estimate the uncertainty of collaborative object detection. We propose a novel uncertainty quantification method, called Double-M Quantification, which tailors a moving block bootstrap (MBB) algorithm with direct modeling of the multivariate Gaussian distribution of each corner of the bounding box. Our method captures both epistemic and aleatoric uncertainty in a single inference pass based on the offline Double-M training process, and it can be used with different collaborative object detectors. Through experiments on a comprehensive collaborative perception dataset, we show that our Double-M method achieves more than a 4× improvement in uncertainty score and more than a 3% improvement in accuracy, compared with state-of-the-art uncertainty quantification methods. Our code is publicly available at https://coperception.github.io/double-m-quantification/.
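The abstract names the two ingredients of Double-M Quantification: a moving block bootstrap over sequential driving frames, and direct modeling of a per-corner Gaussian covariance. The Python sketch below illustrates, under stated assumptions, how temporally ordered frames might be resampled in overlapping blocks and how the aleatoric covariances predicted by each bootstrapped detector could be combined with the epistemic spread of their mean predictions (law of total variance). Function names, array shapes, and the combination rule are illustrative assumptions, not the authors' implementation; in particular, the paper's single-pass inference based on offline Double-M training is not reproduced here.

```python
import numpy as np

def moving_block_bootstrap_indices(n_frames, block_len, rng):
    """Resample a temporally ordered dataset by drawing overlapping blocks
    of consecutive frame indices (moving block bootstrap). Hypothetical helper."""
    n_blocks = int(np.ceil(n_frames / block_len))
    # Each block start is drawn uniformly so the block stays inside the sequence.
    starts = rng.integers(0, n_frames - block_len + 1, size=n_blocks)
    idx = np.concatenate([np.arange(s, s + block_len) for s in starts])
    return idx[:n_frames]

def combined_corner_uncertainty(corner_means, corner_covs):
    """Combine uncertainties for one bounding-box corner (illustrative only).

    corner_means: (K, 2) predicted (x, y) of the corner from K bootstrapped models
    corner_covs:  (K, 2, 2) per-model covariance from a direct-modeling head
    Returns the averaged corner and a 2x2 total covariance (law of total variance).
    """
    mean = corner_means.mean(axis=0)
    # Aleatoric part: average of the covariances each model predicts for its output.
    aleatoric = corner_covs.mean(axis=0)
    # Epistemic part: spread of the mean predictions across the bootstrapped models.
    diffs = corner_means - mean
    epistemic = diffs.T @ diffs / max(len(corner_means) - 1, 1)
    return mean, aleatoric + epistemic

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Hypothetical example: 8 bootstrapped detectors predicting one corner.
    means = rng.normal(loc=[10.0, 5.0], scale=0.1, size=(8, 2))
    covs = np.tile(0.05 * np.eye(2), (8, 1, 1))
    corner, total_cov = combined_corner_uncertainty(means, covs)
    print("corner:", corner, "\ntotal covariance:\n", total_cov)
```

The block-wise resampling preserves short-range temporal correlation between consecutive frames, which is the usual motivation for a moving block bootstrap over an i.i.d. bootstrap on sequential data.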