AI-embodied multi-modal flexible electronic robots with programmable sensing, actuating and self-learning
Junfeng Li,
Zhangyu Xu,
Nanpei Li,
Kaijun Zhang,
Guangyong Xiong,
Minjie Sun,
Chao Hou,
Jingjing Ji,
Fan Zhang,
Junwen Zhong and
YongAn Huang
Additional contact information
Junfeng Li: Wuhan University of Technology
Zhangyu Xu: Huazhong University of Science and Technology
Nanpei Li: Wuhan University of Technology
Kaijun Zhang: University of Macau
Guangyong Xiong: Wuhan University of Technology
Minjie Sun: Wuhan University of Technology
Chao Hou: Huazhong University of Science and Technology
Jingjing Ji: Huazhong University of Science and Technology
Fan Zhang: Huazhong University of Science and Technology
Junwen Zhong: University of Macau
YongAn Huang: Huazhong University of Science and Technology
Nature Communications, 2025, vol. 16, issue 1, 1-10
Abstract:
Achieving robust environmental interaction in small-scale soft robotics remains challenging due to limitations in terrain adaptability, real-time perception, and autonomous decision-making. Here, we introduce Flexible Electronic Robots constructed from programmable flexible electronic components and setae modules. The integrated platform combines multimodal sensing/actuation with embedded computing, enabling adaptive operation in diverse environments. Applying modular design principles to configure structural topologies, actuation sequences, and circuit layouts, these robots achieve multimodal locomotion, including vertical surface traversal, directional control, and obstacle navigation. The system implements proprioception (shape and attitude) and exteroception (vision, temperature, humidity, proximity and pathway shape recognition) under dynamic conditions. Onboard computational units enable autonomous behaviors like hazard evasion and thermal gradient tracking through adaptive decision-making, supported by embodied artificial intelligence. In this work, we establish a framework for creating small-scale soft robots with enhanced environmental intelligence through tightly integrated sensing, actuation, and decision-making architectures.
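The abstract describes onboard decision-making that fuses proprioceptive and exteroceptive signals to produce behaviors such as hazard evasion and thermal gradient tracking. The sketch below is a minimal, hypothetical illustration of such a sense-decide-act loop; the sensor fields, thresholds, and function names (read_sensors, decide, actuate) are assumptions made for illustration and are not the authors' implementation.

```python
# Hypothetical sense-decide-act loop for a small soft robot.
# All readings, thresholds, and commands are illustrative placeholders.
import random
from dataclasses import dataclass


@dataclass
class SensorFrame:
    temperature_c: float   # exteroceptive: ambient temperature
    proximity_cm: float    # exteroceptive: distance to nearest obstacle
    pitch_deg: float       # proprioceptive: body attitude


def read_sensors() -> SensorFrame:
    """Stand-in for reads from the flexible-electronics sensor modules."""
    return SensorFrame(
        temperature_c=25.0 + random.uniform(-2.0, 8.0),
        proximity_cm=random.uniform(1.0, 30.0),
        pitch_deg=random.uniform(-10.0, 10.0),
    )


def decide(frame: SensorFrame, prev_temp: float) -> str:
    """Rule-based policy: evade hazards first, otherwise follow the thermal
    gradient by advancing while temperature rises and turning when it falls."""
    if frame.proximity_cm < 5.0:          # obstacle too close -> evade
        return "turn_away"
    if frame.temperature_c > 45.0:        # thermal hazard -> retreat
        return "retreat"
    if frame.temperature_c >= prev_temp:  # gradient ascent toward heat source
        return "advance"
    return "turn_left"                    # lost the gradient -> search


def actuate(command: str) -> None:
    """Stand-in for driving the setae actuation sequence."""
    print(f"actuation command: {command}")


def control_loop(steps: int = 10) -> None:
    prev_temp = read_sensors().temperature_c
    for _ in range(steps):
        frame = read_sensors()
        actuate(decide(frame, prev_temp))
        prev_temp = frame.temperature_c


if __name__ == "__main__":
    control_loop()
```

A real controller of this kind would replace the random stand-ins with sampled sensor channels and map the symbolic commands onto actuation sequences for the setae modules; the rule thresholds here simply make the evade-then-track priority ordering explicit.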
Date: 2025
Downloads: https://www.nature.com/articles/s41467-025-63881-6 Abstract (text/html)
Persistent link: https://EconPapers.repec.org/RePEc:nat:natcom:v:16:y:2025:i:1:d:10.1038_s41467-025-63881-6
Ordering information: This journal article can be ordered from https://www.nature.com/ncomms/
DOI: 10.1038/s41467-025-63881-6
Nature Communications is currently edited by Nathalie Le Bot, Enda Bergin and Fiona Gillespie