Keywords
Drone; Computer science; Computer vision; Artificial intelligence; Event (particle physics); Onboard; Image (mathematics); Fusion; Real-time computing; Linguistics; Philosophy; Physics; Genetics; Quantum mechanics; Engineering; Biology; Aerospace engineering
Authors
Xinjun Cai, Jingao Xu, Kuntian Deng, Hongbo Lan, Yue Wu, Xiangwen Zhuge, Zheng Yang
Abstract
Drones have gained extensive popularity across diverse smart applications, and visual SLAM technology is commonly used to estimate the 6-DoF pose for a drone's flight control system. However, traditional image-based SLAM cannot ensure flight safety, especially in challenging conditions such as high-speed flight and high-dynamic-range scenes. The event camera, a new type of vision sensor, has the potential to let drones overcome these challenges if fused into image-based SLAM. Unfortunately, the computational demands of event-image fusion SLAM grow manyfold compared to image-based SLAM, and existing work on visual SLAM acceleration cannot run event-image fusion SLAM in real time on on-board drone computing platforms. To fill this gap, we present TrinitySLAM, a high-accuracy, real-time, low-energy event-image fusion SLAM acceleration framework built on Xilinx Zynq, an on-board heterogeneous computing platform. The key innovations of TrinitySLAM include a fine-grained computation allocation strategy, several novel hardware-software co-acceleration designs, and an efficient data exchange mechanism. We fully implement TrinitySLAM on the latest Zynq UltraScale+ platform and evaluate it on one self-collected drone dataset and four public datasets covering various scenarios. Comprehensive experiments show that TrinitySLAM improves pose estimation accuracy by 28%, halves end-to-end latency, and achieves a 1.2× reduction in energy consumption compared with the most comparable state-of-the-art heterogeneous computing platform acceleration baseline.
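To make the fine-grained computation allocation idea concrete, the sketch below shows one plausible way an event-image fusion front end could be split across the two sides of a Zynq device (PS = ARM processing system, PL = FPGA fabric). The abstract does not disclose TrinitySLAM's actual partitioning, so the stage names, the allocation table, and the NumPy stand-ins for FPGA kernels here are all hypothetical illustrations, not the authors' implementation.

```python
# Hypothetical sketch: a fine-grained stage-to-hardware allocation for an
# event-image fusion SLAM front end. All names and kernels are assumptions;
# the real TrinitySLAM design is not described in the abstract.
import numpy as np

# Regular, data-parallel stages are natural candidates for the PL (FPGA),
# while branchy nonlinear optimization typically stays on the PS (CPU).
ALLOCATION = {
    "accumulate_events": "PL",
    "extract_features":  "PL",
    "fuse_and_optimize": "PS",
}

def accumulate_events(events, shape=(180, 240)):
    """Bin asynchronous (x, y, polarity) events into a frame-like image."""
    frame = np.zeros(shape, dtype=np.float32)
    for x, y, p in events:
        frame[y, x] += 1.0 if p else -1.0
    return frame

def extract_features(image, k=50):
    """Toy corner proxy: the k pixels with the largest gradient magnitude."""
    gy, gx = np.gradient(image)
    mag = np.hypot(gx, gy)
    idx = np.argsort(mag, axis=None)[-k:]
    return np.column_stack(np.unravel_index(idx, image.shape))

def fuse_and_optimize(img_feats, evt_feats):
    """Placeholder for joint 6-DoF pose estimation from both feature sets."""
    return {"n_image_feats": len(img_feats), "n_event_feats": len(evt_feats)}

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    image = rng.random((180, 240)).astype(np.float32)
    events = [(rng.integers(240), rng.integers(180), rng.integers(2))
              for _ in range(1000)]

    event_frame = accumulate_events(events)           # would run on the PL
    img_feats = extract_features(image)               # would run on the PL
    evt_feats = extract_features(event_frame)         # would run on the PL
    result = fuse_and_optimize(img_feats, evt_feats)  # stays on the PS
    print(ALLOCATION, result)
```

The design intuition the sketch encodes is the one the abstract hints at: event accumulation and feature extraction are regular, per-pixel workloads that map well to FPGA fabric, while the fusion and optimization step is control-heavy and better suited to the CPU, with an efficient data exchange path needed between the two.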