Keywords: Computer Science; Artificial Intelligence; Key (lock); Originality; Data Science; Deep Learning; Work (physics); Tracking; Sensor Fusion; Simultaneous Localization and Mapping; Development (topology); Management Science; Human-Computer Interaction; Open Research; Engineering; Grand Challenges; Visual Approach
Authors
Lebin Zhao,Tao Chen,Pei-Pei Yuan,Xiaoyang Li,Bin Chen
Source
Journal: Industrial Robot-an International Journal
[Emerald Publishing Limited]
Date: 2025-09-22
Identifier
DOI: 10.1108/ir-04-2025-0137
Abstract
Purpose – This study aims to enhance understanding of the current research status, challenges and potential development directions of deep learning (DL)-based visual simultaneous localization and mapping (VSLAM), thereby laying the groundwork for its applications in autonomous navigation, intelligent driving and other related domains.

Design/methodology/approach – This study comprehensively assesses recent advances and future challenges in DL-based VSLAM and visual-inertial SLAM (VISLAM). It first introduces existing review studies and clarifies its own unique positioning. It then discusses the key contributions, strengths and limitations of V(I)SLAM methods from three perspectives: supervised learning, unsupervised learning and hybrid approaches combining classical and learning-based methods. It also includes a targeted survey of research on semantic SLAM focused on dynamic scenes. Finally, potential development directions and challenges are proposed.

Findings – Hybrid learning methods demonstrate clear advantages in dynamic or visually degraded environments and hold significant development potential. Exploring novel network architectures and fusing with other sensors are also crucial directions for VSLAM advancement. However, these efforts require support from multimodal, explainable and robustness-focused datasets alongside unified evaluation metrics.

Originality/value – To the best of the authors' knowledge, the originality of this work lies in its systematic summary and analysis of V(I)SLAM research based on a DL taxonomy, while methodically tracing the methodological evolution from classical static methods to dynamic semantic-aware paradigms. This paper further outlines future development trajectories, providing valuable references for researchers in related fields.