Robot
Computer science
Robotics
Artificial intelligence
Actuator
Control engineering
Jacobian matrix and determinant
Field (mathematics)
Automation
Artificial neural network
Robot control
Software
Mobile robot
Engineering
Mechanical engineering
Mathematics
Programming language
Applied mathematics
Pure mathematics
Authors
Sizhe Li, Annan Zhang, Boyuan Chen, Hanna Matusik, Chao Liu, Daniela Rus, Vincent Sitzmann
Source
Journal: Nature
[Springer Nature]
Date: 2025-06-25
Identifier
DOI: 10.1038/s41586-025-09170-0
Abstract
Mirroring the complex structures and diverse functions of natural organisms is a long-standing challenge in robotics [1–4]. Modern fabrication techniques have greatly expanded the feasible hardware [5–8], but using these systems requires control software to translate the desired motions into actuator commands. Conventional robots can easily be modelled as rigid links connected by joints, but it remains an open challenge to model and control biologically inspired robots that are often soft or made of several materials, lack sensing capabilities and may change their material properties with use [9–12]. Here, we introduce a method that uses deep neural networks to map a video stream of a robot to its visuomotor Jacobian field (the sensitivity of all 3D points to the robot’s actuators). Our method enables the control of robots from only a single camera, makes no assumptions about the robots’ materials, actuation or sensing, and is trained without expert intervention by observing the execution of random commands. We demonstrate our method on a diverse set of robot manipulators that vary in actuation, materials, fabrication and cost. Our approach achieves accurate closed-loop control and recovers the causal dynamic structure of each robot. Because it enables robot control using a generic camera as the only sensor, we anticipate that our work will broaden the design space of robotic systems and serve as a starting point for lowering the barrier to robotic automation.
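The abstract's core idea, a Jacobian field giving the sensitivity of every tracked 3D point to each actuator, can be turned into closed-loop control by solving a least-squares problem for the actuator command that best produces a desired point motion. The sketch below illustrates only that generic principle; the function name, array shapes, gain, and the toy Jacobian are illustrative assumptions, not the paper's actual architecture (which estimates the Jacobian field from video with a deep network).

```python
import numpy as np

def control_step(J, current_pts, target_pts, gain=0.5):
    """One hypothetical control step using a Jacobian field.

    J           : (n_points, 3, n_actuators) sensitivity of each 3D point
                  to each actuator command (the "Jacobian field").
    current_pts : (n_points, 3) observed 3D point positions.
    target_pts  : (n_points, 3) desired 3D point positions.
    Returns an (n_actuators,) command update minimizing ||J u - gain*error||.
    """
    error = (target_pts - current_pts).reshape(-1)   # (3*n_points,) desired motion
    J_flat = J.reshape(-1, J.shape[-1])              # (3*n_points, n_actuators)
    u, *_ = np.linalg.lstsq(J_flat, gain * error, rcond=None)
    return u

# Toy example: 2 tracked points, 3 actuators, a hand-written Jacobian in which
# each actuator moves point 0 by 1 unit and point 1 by 0.5 units along one axis.
J = np.stack([np.eye(3), 0.5 * np.eye(3)])           # shape (2, 3, 3)
pts = np.zeros((2, 3))
tgt = np.ones((2, 3))
u = control_step(J, pts, tgt, gain=1.0)              # → array of 1.2s (least-squares compromise)
```

In a real closed loop this step would be repeated: the camera re-observes the points, the network re-predicts the Jacobian field, and a new command is solved for, so modelling errors are corrected by feedback rather than requiring an exact robot model.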