Controlling diverse robots by inferring Jacobian fields with deep networks
Sizhe Lester Li,
Annan Zhang,
Boyuan Chen,
Hanna Matusik,
Chao Liu,
Daniela Rus and
Vincent Sitzmann
Additional contact information
Sizhe Lester Li: Massachusetts Institute of Technology
Annan Zhang: Massachusetts Institute of Technology
Boyuan Chen: Massachusetts Institute of Technology
Hanna Matusik: Massachusetts Institute of Technology
Chao Liu: Massachusetts Institute of Technology
Daniela Rus: Massachusetts Institute of Technology
Vincent Sitzmann: Massachusetts Institute of Technology
Nature, 2025, vol. 643, issue 8070, 89-95
Abstract:
Mirroring the complex structures and diverse functions of natural organisms is a long-standing challenge in robotics [1–4]. Modern fabrication techniques have greatly expanded the feasible hardware [5–8], but using these systems requires control software to translate the desired motions into actuator commands. Conventional robots can easily be modelled as rigid links connected by joints, but it remains an open challenge to model and control biologically inspired robots that are often soft or made of several materials, lack sensing capabilities and may change their material properties with use [9–12]. Here, we introduce a method that uses deep neural networks to map a video stream of a robot to its visuomotor Jacobian field (the sensitivity of all 3D points to the robot’s actuators). Our method enables the control of robots from only a single camera, makes no assumptions about the robots’ materials, actuation or sensing, and is trained without expert intervention by observing the execution of random commands. We demonstrate our method on a diverse set of robot manipulators that vary in actuation, materials, fabrication and cost. Our approach achieves accurate closed-loop control and recovers the causal dynamic structure of each robot. Because it enables robot control using a generic camera as the only sensor, we anticipate that our work will broaden the design space of robotic systems and serve as a starting point for lowering the barrier to robotic automation.
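A Jacobian field, as described above, relates small actuator commands to the resulting motion of tracked 3D points. This naturally supports a standard closed-loop scheme: at each step, solve a damped least-squares problem for the command that moves the points toward their targets. The sketch below is illustrative only and is not the paper's implementation; the `jacobian_control_step` helper, its array shapes, and the `gain` and `damping` parameters are assumptions introduced for this example.

```python
import numpy as np

def jacobian_control_step(jacobian, points, targets, gain=0.5, damping=1e-3):
    """One damped least-squares (Levenberg–Marquardt-style) control step.

    jacobian : (3N, A) array; sensitivity of N tracked 3D points to A actuators
    points   : (N, 3) array; current 3D point positions
    targets  : (N, 3) array; desired 3D point positions
    Returns an actuator command of shape (A,).
    """
    # Flatten the desired 3D displacement of all points into one error vector.
    error = (targets - points).reshape(-1)
    num_actuators = jacobian.shape[1]
    # Damped least squares: u = (J^T J + lambda * I)^{-1} J^T e.
    # The damping keeps the solve well-conditioned near singular configurations.
    u = np.linalg.solve(
        jacobian.T @ jacobian + damping * np.eye(num_actuators),
        jacobian.T @ error,
    )
    # A gain below 1 takes a conservative step, as in typical visual servoing.
    return gain * u
```

In a closed loop, one would re-estimate the Jacobian field from the camera after each command and repeat, so that point-tracking errors are corrected continuously rather than relying on a single open-loop solve.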
Date: 2025
Downloads: https://www.nature.com/articles/s41586-025-09170-0 Abstract (text/html)
Access to the full text of the articles in this series is restricted.
Persistent link: https://EconPapers.repec.org/RePEc:nat:nature:v:643:y:2025:i:8070:d:10.1038_s41586-025-09170-0
Ordering information: This journal article can be ordered from
https://www.nature.com/
DOI: 10.1038/s41586-025-09170-0