Robustness (evolution)
Computer science
Initialization
Computer vision
Artificial intelligence
Simultaneous localization and mapping
Sensor fusion
Inertial measurement unit
Fuse (electrical)
Inertial frame of reference
Process (computing)
Real-time computing
Mobile robot
Robot
Engineering
Gene
Operating system
Electrical engineering
Physics
Quantum mechanics
Biochemistry
Chemistry
Programming language
Authors
Bo Yang,Jun Li,Hong Zhang
Identifiers
DOI: 10.1109/TIM.2021.3101322
Abstract
In this article, a resilient tightly coupled ultra-wideband (UWB) visual–inertial indoor localization system (R-UVIS) is developed to obtain accurate and robust localization performance in complex scenes, even when sensors fail. More specifically, three schemes are designed for the proposed system. First, we introduce line and image-patch features to improve the precision and robustness of the visual features. In addition, we propose accurate loop-closure and relocalization methods based on these multiple features to improve the performance of the localization system. Second, we introduce the UWB sensor into the system to suppress localization drift in complex scenes and to provide a fixed reference frame. Third, we propose a resilient multisensor fusion method based on an optimization framework that fuses the UWB, visual, and inertial measurements in a tightly coupled manner. This data fusion approach improves the robustness of the system, allowing the localization system to switch seamlessly among different localization modes depending on the specific scene. In addition, an initialization process involving all three sensors is designed for the whole system. We conduct extensive experiments on a public dataset and in real-world scenarios to evaluate the proposed R-UVIS. The experimental results show that the proposed R-UVIS provides accurate and robust localization results in a fixed coordinate system for complicated indoor scenes, even when visual tracking fails or the UWB anchors are unavailable.
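As a rough illustration of the tightly coupled optimization described above, the sketch below assembles a joint cost from UWB range residuals, visual reprojection residuals, and IMU residuals, and simply drops a sensor's term when its measurements are unavailable, which mimics the resilient switching among localization modes. This is a minimal, hypothetical NumPy/SciPy example; the function names, weights, and residual forms are assumptions and do not reproduce the authors' actual implementation.

import numpy as np
from scipy.optimize import minimize

def uwb_residuals(position, anchors, ranges):
    # Range residuals: measured range minus Euclidean distance to each anchor.
    dists = np.linalg.norm(anchors - position, axis=1)
    return ranges - dists

def joint_cost(position, anchors=None, ranges=None,
               visual_residuals=None, imu_residuals=None,
               w_uwb=1.0, w_vis=1.0, w_imu=1.0):
    # Weighted sum of squared residuals from whatever sensors are available.
    # Missing sensors contribute nothing, so the same cost covers UWB-only,
    # visual-inertial-only, and fully fused modes (illustrative only).
    cost = 0.0
    if anchors is not None and ranges is not None and len(ranges) > 0:
        cost += w_uwb * np.sum(uwb_residuals(position, anchors, ranges) ** 2)
    if visual_residuals is not None:
        cost += w_vis * np.sum(np.asarray(visual_residuals) ** 2)
    if imu_residuals is not None:
        cost += w_imu * np.sum(np.asarray(imu_residuals) ** 2)
    return cost

# Example: visual tracking has failed, so only the UWB anchors (a fixed
# reference frame) constrain the position estimate. Anchor layout, the true
# position, and noiseless ranges are made up for this demonstration.
anchors = np.array([[0.0, 0.0, 2.5], [5.0, 0.0, 2.5],
                    [0.0, 5.0, 2.5], [5.0, 5.0, 1.0]])
true_pos = np.array([2.0, 3.0, 1.2])
ranges = np.linalg.norm(anchors - true_pos, axis=1)
result = minimize(joint_cost, x0=np.zeros(3), args=(anchors, ranges))
print("estimated position:", result.x)

In a real tightly coupled system the optimization would run over a sliding window of poses with IMU preintegration factors and robust loss functions; the single-position least-squares problem above is only meant to show how unavailable sensor terms can be excluded without changing the rest of the cost.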