LiDAR
Computer science
Calibration
Artificial intelligence
Pose
Computer vision
Point cloud
Standard deviation
Sensor fusion
Mathematics
Remote sensing
Statistics
Geology
Authors
You-Jeong Choi,Ju H. Park,Ho-Youl Jung
Identifier
DOI:10.1109/tim.2023.3336440
Abstract
Light detection and ranging (LiDAR) and cameras are core sensors used in autonomous vehicles and industrial robots. LiDAR-camera fusion systems require an accurate estimation of the relative pose to integrate different sensor data. We propose an offline method for 3-D LiDAR-camera extrinsic calibration using an orthogonal trihedron with checkered patterns on each plane. Our approach for LiDAR pose estimation consists of four steps: background rejection, perpendicularity enforcement, dominant pose decision, and refinement. In the iterations of the first and second steps, several poses are sampled. The sample poses are evaluated and augmented, then the highest scoring sample is determined as the dominant pose. For the refinement, a new loss function with adaptive weights is introduced, which is formulated as the minimization of the sum of the squared distance between points and the nearest plane on the target. The relative pose is estimated by solving the perspective-n-point (PnP) problem. Our experimental results through simulations in various noise scenarios show that the proposed method estimates the relative poses with higher accuracy and stability compared to existing methods, in terms of the mean and standard deviation of errors. The source code is available at https://github.com/ygchoi11/3DLiDAR-Camera_Calibration.
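The refinement loss described in the abstract (sum of squared distances from each LiDAR point to the nearest plane of the trihedral target) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the plane representation, the nearest-plane assignment, and the optional per-point `weights` argument (standing in for the paper's adaptive weights, whose exact form the abstract does not give) are all assumptions.

```python
import numpy as np

def point_to_plane_loss(points, planes, weights=None):
    """Sum of squared distances from each point to its nearest target plane.

    points:  (N, 3) array of LiDAR points on the calibration target.
    planes:  list of (n, d) pairs, where n is a unit normal and d an offset,
             so each plane is {x : n . x + d = 0}.
    weights: optional (N,) per-point weights; a stand-in for the paper's
             adaptive weights (exact form not given in the abstract).
    """
    pts = np.asarray(points, dtype=float)
    # Unsigned distance of every point to every plane: |n . x + d|.
    dists = np.stack([np.abs(pts @ n + d) for n, d in planes], axis=1)
    nearest = dists.min(axis=1)  # distance to the nearest plane per point
    if weights is None:
        weights = np.ones(len(pts))
    return float(np.sum(weights * nearest ** 2))

# Example: an orthogonal trihedron aligned with the coordinate planes.
trihedron = [
    (np.array([1.0, 0.0, 0.0]), 0.0),  # plane x = 0
    (np.array([0.0, 1.0, 0.0]), 0.0),  # plane y = 0
    (np.array([0.0, 0.0, 1.0]), 0.0),  # plane z = 0
]
# A point 0.1 m off its nearest plane contributes 0.1**2 = 0.01 to the loss.
loss = point_to_plane_loss([[0.5, 0.5, 0.1]], trihedron)
```

In the paper's setting, this loss would be minimized over the LiDAR pose (rotation and translation applied to the points) during the refinement step.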