Computer vision
Artificial intelligence
Computer science
Inertial measurement unit
Optical flow
Motion compensation
Event (particle physics)
Image warping
Dynamic time warping
Feature (linguistics)
Image (mathematics)
Linguistics
Quantum mechanics
Physics
Philosophy
Authors
Chunhui Zhao, Yakun Li, Yang Lyu
Identifier
DOI:10.1109/icra48891.2023.10160472
Abstract
Accurate and timely onboard perception is a prerequisite for mobile robots to operate in highly dynamic scenarios. The bio-inspired event camera can capture more motion details than a traditional camera by triggering each pixel asynchronously, and is therefore better suited to such scenarios. Among the various perception tasks based on the event camera, ego-motion removal is a fundamental procedure for reducing perception ambiguities. Recent ego-motion removal methods are mainly based on optimization processes and may be computationally expensive for robot applications. In this paper, we consider the challenging perception task of detecting fast-moving objects from an aggressively operated platform equipped with an event camera, and reduce computational cost by directly employing IMU motion measurements. First, we design a nonlinear warping function that captures rotation information from an IMU and compensates for camera motion over the asynchronous event stream. The proposed nonlinear warping function improves compensation accuracy by 10%-15%. Afterward, we segment the moving parts of the warped image through dynamic threshold segmentation, optical flow calculation, and clustering. Finally, we validate the proposed detection pipeline on public datasets and real-world data streams containing challenging light conditions and fast-moving objects.
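A worked sketch of the core motion-compensation step may help make the pipeline concrete. The code below is a minimal illustration, not the authors' implementation: it assumes a pinhole camera model with hypothetical intrinsics `K_intr`, a constant IMU angular velocity `omega` over the event window, and a toy event batch, then warps each event to a reference time by undoing the accumulated rotation (Rodrigues' formula) and accumulates the warped events into an image that could subsequently be thresholded and clustered.

```python
# Minimal sketch (not the authors' code) of IMU-based rotational motion
# compensation for an event stream. Assumptions: pinhole camera model,
# constant angular velocity over the event window, events given as rows
# of (x, y, t, polarity). All names here are illustrative placeholders.

import numpy as np


def rotation_from_axis_angle(axis_angle):
    """Rodrigues' formula: axis-angle vector -> 3x3 rotation matrix."""
    theta = np.linalg.norm(axis_angle)
    if theta < 1e-12:
        return np.eye(3)
    k = axis_angle / theta
    K = np.array([[0, -k[2], k[1]],
                  [k[2], 0, -k[0]],
                  [-k[1], k[0], 0]])
    return np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)


def warp_events_to_reference(events, omega, K_intr, t_ref):
    """Warp event pixel locations to time t_ref by removing the rotation
    accumulated between each event timestamp and t_ref."""
    K_inv = np.linalg.inv(K_intr)
    warped = np.empty((len(events), 2))
    for i, (x, y, t, _p) in enumerate(events):
        # Camera rotation between t and t_ref (constant-rate assumption).
        R = rotation_from_axis_angle(omega * (t_ref - t))
        # Back-project the pixel to a bearing vector, rotate, re-project.
        bearing = K_inv @ np.array([x, y, 1.0])
        rotated = K_intr @ (R @ bearing)
        warped[i] = rotated[:2] / rotated[2]
    return warped


def accumulate_image(warped_xy, shape):
    """Accumulate warped events into an image of event counts."""
    img = np.zeros(shape, dtype=np.float32)
    xs = np.round(warped_xy[:, 0]).astype(int)
    ys = np.round(warped_xy[:, 1]).astype(int)
    valid = (xs >= 0) & (xs < shape[1]) & (ys >= 0) & (ys < shape[0])
    np.add.at(img, (ys[valid], xs[valid]), 1.0)
    return img


if __name__ == "__main__":
    # Synthetic example with made-up intrinsics and a tiny event batch.
    K_intr = np.array([[200.0, 0.0, 120.0],
                       [0.0, 200.0, 90.0],
                       [0.0, 0.0, 1.0]])
    events = np.array([[100.0, 80.0, 0.000, 1],
                       [110.0, 85.0, 0.005, -1],
                       [120.0, 90.0, 0.010, 1]])
    omega = np.array([0.0, 0.0, 0.5])  # rad/s, yaw-only for illustration
    warped = warp_events_to_reference(events, omega, K_intr, t_ref=0.010)
    iwe = accumulate_image(warped, shape=(180, 240))
    print(warped)
    print("event count:", iwe.sum())
```

Using the exact per-event axis-angle rotation, rather than a small-angle linearization, is one way a warping function becomes nonlinear in the sense the abstract describes; the dynamic threshold segmentation, optical flow, and clustering stages would then operate on the accumulated image produced by this step.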