LiDAR
Point cloud
Computer vision
Artificial intelligence
Computer science
Iterative closest point
Simultaneous localization and mapping
Pose
Odometry
Remote sensing
Cluster analysis
Mobile robot
Geography
Robotics
Authors
Jie Qian, Kaiqi Chen, Qinying Chen, Yanhong Yang, Jianhua Zhang, Shengyong Chen
Identifier
DOI:10.1109/lgrs.2021.3099166
Abstract
Obtaining 3-D data by LIDAR from unmanned aerial vehicles (UAVs) is vital for the field of remote sensing; however, the highly dynamic movement of UAVs and the narrow viewpoint of LIDAR pose a great challenge to UAV self-localization based solely on the LIDAR sensor. To this end, we propose a robust simultaneous localization and mapping (SLAM) system that combines image data obtained by a vision sensor with point clouds obtained by LIDAR. In the front end of the proposed system, the more stable line and plane features are extracted from point clouds through clustering. Then, the relative pose between two consecutive frames is computed by the least-squares iterative closest point (ICP) algorithm. Afterward, a novel direct odometry algorithm is developed by combining the image frames and sparse point clouds, where the relative pose is used as a prior. In the back end, the pose estimation is refined and a 3-D map with texture information is built at a lower frequency. Extensive experiments show that our method achieves robust and highly precise localization and mapping for UAVs.
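The relative-pose step described in the abstract rests on least-squares ICP between consecutive scans. A minimal sketch of that idea (not the authors' implementation; plain point-to-point ICP with a brute-force nearest-neighbour search and an SVD-based least-squares alignment, assuming the two scans are roughly pre-aligned):

```python
import numpy as np

def best_fit_transform(A, B):
    """Least-squares rigid transform (R, t) mapping points A onto B (Kabsch/SVD)."""
    ca, cb = A.mean(axis=0), B.mean(axis=0)
    H = (A - ca).T @ (B - cb)                 # cross-covariance of centred points
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                  # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cb - R @ ca
    return R, t

def icp(src, dst, iters=30, tol=1e-7):
    """Point-to-point ICP: alternate nearest-neighbour matching and alignment."""
    R_total, t_total = np.eye(3), np.zeros(3)
    cur = src.copy()
    prev_err = np.inf
    for _ in range(iters):
        # brute-force nearest neighbours (a k-d tree would be used in practice)
        d = np.linalg.norm(cur[:, None, :] - dst[None, :, :], axis=2)
        idx = d.argmin(axis=1)
        R, t = best_fit_transform(cur, dst[idx])
        cur = cur @ R.T + t                   # apply the incremental transform
        R_total, t_total = R @ R_total, R @ t_total + t
        err = d[np.arange(len(cur)), idx].mean()
        if abs(prev_err - err) < tol:         # stop when residual stabilises
            break
        prev_err = err
    return R_total, t_total
```

The paper's variant works on line and plane features extracted by clustering rather than raw points, which makes the correspondence step far more stable under the sparse, narrow-field scans a UAV-mounted LIDAR produces.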