Simultaneous localization and mapping (SLAM)
Artificial intelligence
Odometry
Benchmark (surveying)
Computer science
Robotics
Visual odometry
Computer vision
Global Positioning System (GPS)
Robot
Mobile robot
Geography
Cartography
Telecommunications
Authors
Jose Cuaran,Andrés Eduardo Baquero Velasquez,Mateus Valverde Gasparino,Naveen Kumar Uppalapati,Arun Narenthiran Sivakumar,Justin Wasserman,Muhammad Huzaifa,Sarita V. Adve,Girish Chowdhary
Identifier
DOI:10.1177/02783649231215372
Abstract
Simultaneous localization and mapping (SLAM) has been an active research problem over recent decades. Many leading solutions achieve remarkable performance in environments with familiar structure, such as indoor spaces and cities. However, our work shows that these leading systems fail in an agricultural setting, particularly in under-canopy navigation in the world's largest crops by acreage: corn (Zea mays) and soybean (Glycine max). The abundance of visual clutter from leaves, varying illumination, and stark visual similarity strips these environments of the familiar structure on which SLAM algorithms rely. To advance SLAM in such unstructured agricultural environments, we present a comprehensive agricultural dataset. Our open dataset consists of stereo images, IMU, wheel encoder, and GPS measurements continuously recorded from a mobile robot in corn and soybean fields across different growth stages. In addition, we present best-case benchmark results for several leading visual-inertial odometry and SLAM systems. Our data and benchmark clearly show that there is significant research promise in SLAM for agricultural settings. The dataset is available online at: https://github.com/jrcuaranv/terrasentia-dataset.
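The abstract does not specify how the recordings are packaged, so the sketch below only illustrates one common way to time-align multi-sensor logs (stereo frames, IMU, GPS) before feeding them to a visual-inertial pipeline. The file names (camera_left.csv, imu.csv, gps.csv) and the timestamp column are hypothetical placeholders, not the dataset's actual schema; consult the GitHub repository for the real format.

```python
# Minimal sketch: nearest-timestamp association of sensor streams.
# File names and columns below are assumptions for illustration only.
import csv
from bisect import bisect_left

def load_timestamps(path, column="timestamp"):
    """Read one sensor log and return its timestamps (seconds) as floats."""
    with open(path, newline="") as f:
        return [float(row[column]) for row in csv.DictReader(f)]

def nearest_index(sorted_ts, query):
    """Index of the timestamp in `sorted_ts` closest to `query`."""
    i = bisect_left(sorted_ts, query)
    if i == 0:
        return 0
    if i == len(sorted_ts):
        return len(sorted_ts) - 1
    return i if sorted_ts[i] - query < query - sorted_ts[i - 1] else i - 1

# Associate each stereo frame with its nearest IMU and GPS samples so a
# visual-inertial odometry or SLAM system can consume loosely synchronized tuples.
cam_ts = load_timestamps("camera_left.csv")
imu_ts = load_timestamps("imu.csv")
gps_ts = load_timestamps("gps.csv")

frame_associations = [
    (t, nearest_index(imu_ts, t), nearest_index(gps_ts, t)) for t in cam_ts
]
```

In practice, IMU data arrives at a much higher rate than camera frames, so a full pipeline would integrate all IMU samples between consecutive frames rather than picking a single nearest one; the nearest-neighbor pairing above is just the simplest alignment strategy.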