Robotics
Proprioception
Computer Science
Human-Computer Interaction
Soft Robotics
Artificial Intelligence
Physics
Physical Medicine and Rehabilitation
Medicine
Authors
Peiyi Wang, Zhexin Xie, Wenci Xin, Zhi Qiang Tang, Xinhua Yang, Muralidharan Mohanakrishnan, Sheng Guo, Cecilia Laschi
Identifier
DOI:10.1038/s41467-024-54327-6
Abstract
A high-level perceptual model found in the human brain is essential to guide robotic control when facing perception-intensive interactive tasks. Soft robots, with their inherent softness, may benefit from such mechanisms when interacting with their surroundings. Here, we propose an expected-actual perception-action loop and demonstrate the model on a sensorized soft continuum robot. By sensing and matching expected and actual shape (1.4% estimation error on average) at each perception loop, our robot system rapidly (detection within 0.4 s) and robustly detects contact and distinguishes deformation sources, whether external and internal actions are applied separately or simultaneously. We also show that our soft arm can accurately perceive contact direction in both static and dynamic configurations (error below 10°), even in interactive environments without vision. The potential of our method is demonstrated in two experimental scenarios: learning to autonomously navigate by touching the walls, and teaching and repeating desired configurations of position and force through interaction with human operators.

The authors propose an expected-actual perception-action loop in soft robots to rapidly and robustly detect contact and distinguish deformation sources, enabling soft robots to explore and learn through interacting with their surroundings.
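As a rough illustration of the expected-actual idea described in the abstract, the sketch below compares the shape predicted from the internal actuation command with the shape reconstructed from embedded sensors, flags external contact when the residual exceeds a threshold, and reads a contact direction from the residual at the tip. This is not the authors' implementation; the forward model, sensor reconstruction, threshold value, and all names are hypothetical placeholders.

```python
# Minimal sketch of an expected-actual perception loop (hypothetical models).
import numpy as np

N_POINTS = 20  # backbone points used to describe the arm's shape


def expected_shape(actuation_cmd: np.ndarray) -> np.ndarray:
    # Hypothetical forward model: predict (x, y) backbone points from the
    # internal actuation alone, i.e. assuming no external load.
    s = np.linspace(0.0, 1.0, N_POINTS)
    return np.outer(s, actuation_cmd)


def actual_shape(sensor_reading: np.ndarray) -> np.ndarray:
    # Hypothetical shape reconstruction from embedded bend/strain sensors.
    return sensor_reading.reshape(N_POINTS, -1)


def perception_step(actuation_cmd, sensor_reading, contact_threshold=0.014):
    """One loop iteration: detect external contact and estimate its direction."""
    exp = expected_shape(np.asarray(actuation_cmd, dtype=float))
    act = actual_shape(np.asarray(sensor_reading, dtype=float))
    residual = act - exp                                   # deformation not explained by actuation
    magnitude = np.linalg.norm(residual) / (np.linalg.norm(exp) + 1e-9)
    in_contact = bool(magnitude > contact_threshold)       # external vs. internal deformation source
    direction_deg = None
    if in_contact:
        tip = residual[-1]                                  # crude cue: tip discrepancy vector
        direction_deg = float(np.degrees(np.arctan2(tip[1], tip[0])))
    return in_contact, direction_deg


# Example: a reading that matches the pure-actuation expectation yields no
# contact, while a uniformly perturbed reading is flagged as external contact.
cmd = np.array([0.5, 0.2])
free = expected_shape(cmd).ravel()
print(perception_step(cmd, free))         # (False, None)
print(perception_step(cmd, free + 0.05))  # (True, ~45.0)
```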