EconPapers

A robotic sensory system with high spatiotemporal resolution for texture recognition

Ningning Bai, Yiheng Xue, Shuiqing Chen, Lin Shi, Junli Shi, Yuan Zhang, Xingyu Hou, Yu Cheng, Kaixi Huang, Weidong Wang, Jin Zhang, Yuan Liu and Chuan Fei Guo
Additional contact information
Ningning Bai: Southern University of Science and Technology
Yiheng Xue: Southern University of Science and Technology
Shuiqing Chen: Southern University of Science and Technology
Lin Shi: Southern University of Science and Technology
Junli Shi: Southern University of Science and Technology
Yuan Zhang: Southern University of Science and Technology
Xingyu Hou: Southern University of Science and Technology
Yu Cheng: Southern University of Science and Technology
Kaixi Huang: Southern University of Science and Technology
Weidong Wang: Xidian University
Jin Zhang: Southern University of Science and Technology
Yuan Liu: University of Houston
Chuan Fei Guo: Southern University of Science and Technology

Nature Communications, 2023, vol. 14, issue 1, 1-11

Abstract: Humans can gently slide a finger across the surface of an object and identify it by capturing both static pressure and high-frequency vibrations. Although modern robots integrated with flexible sensors can precisely detect pressure, shear force, and strain, they still perform insufficiently, or require multiple sensors, to respond to both static and high-frequency physical stimuli during the interaction. Here, we report a real-time artificial sensory system for high-accuracy texture recognition based on a single iontronic slip-sensor, and propose a criterion, spatiotemporal resolution, to correlate the sensing performance with recognition capability. The sensor can respond to both static and dynamic stimuli (0–400 Hz) with a high spatial resolution of 15 μm in spacing and 6 μm in height, together with a high frequency resolution of 0.02 Hz at 400 Hz, enabling high-precision discrimination of fine surface features. Integrated on a prosthetic fingertip, the sensory system can identify 20 different commercial textiles with 100.0% accuracy at a fixed sliding rate and 98.9% accuracy at random sliding rates. The sensory system is expected to help achieve subtle tactile sensation for robotics and prosthetics, and may further be applied to haptic-based virtual reality and beyond.

Date: 2023
References: view references in EconPapers; view the complete reference list from CitEc
Citations: view citations in EconPapers (1)

Downloads: (external link)
https://www.nature.com/articles/s41467-023-42722-4 Abstract (text/html)

Related works:
This item may be available elsewhere in EconPapers: Search for items with the same title.


Persistent link: https://EconPapers.repec.org/RePEc:nat:natcom:v:14:y:2023:i:1:d:10.1038_s41467-023-42722-4

Ordering information: this journal article can be ordered from https://www.nature.com/ncomms/

DOI: 10.1038/s41467-023-42722-4

Access Statistics for this article

Nature Communications is currently edited by Nathalie Le Bot, Enda Bergin and Fiona Gillespie

More articles in Nature Communications from Nature
Bibliographic data for this series is maintained by Sonal Shukla and Springer Nature Abstracting and Indexing.

Page updated 2025-03-19
Handle: RePEc:nat:natcom:v:14:y:2023:i:1:d:10.1038_s41467-023-42722-4