EconPapers    

A star-nose-like tactile-olfactory bionic sensing array for robust object recognition in non-visual environments

Mengwei Liu, Yujia Zhang, Jiachuang Wang, Nan Qin, Heng Yang, Ke Sun, Jie Hao, Lin Shu, Jiarui Liu, Qiang Chen, Pingping Zhang and Tiger H. Tao ()
Additional contact information
Mengwei Liu: State Key Laboratory of Transducer Technology, Shanghai Institute of Microsystem and Information Technology, Chinese Academy of Sciences
Yujia Zhang: State Key Laboratory of Transducer Technology, Shanghai Institute of Microsystem and Information Technology, Chinese Academy of Sciences
Jiachuang Wang: State Key Laboratory of Transducer Technology, Shanghai Institute of Microsystem and Information Technology, Chinese Academy of Sciences
Nan Qin: State Key Laboratory of Transducer Technology, Shanghai Institute of Microsystem and Information Technology, Chinese Academy of Sciences
Heng Yang: State Key Laboratory of Transducer Technology, Shanghai Institute of Microsystem and Information Technology, Chinese Academy of Sciences
Ke Sun: State Key Laboratory of Transducer Technology, Shanghai Institute of Microsystem and Information Technology, Chinese Academy of Sciences
Jie Hao: Institute of Automation, Chinese Academy of Sciences
Lin Shu: Institute of Automation, Chinese Academy of Sciences
Jiarui Liu: Institute of Automation, Chinese Academy of Sciences
Qiang Chen: Shanghai Fire Research Institute of MEM
Pingping Zhang: Suzhou Huiwen Nanotechnology Co., Ltd
Tiger H. Tao: State Key Laboratory of Transducer Technology, Shanghai Institute of Microsystem and Information Technology, Chinese Academy of Sciences

Nature Communications, 2022, vol. 13, issue 1, 1-10

Abstract: Object recognition is among the basic survival skills of human beings and other animals. To date, artificial intelligence (AI)-assisted high-performance object recognition has been primarily visual, empowered by rapid advances in sensing and computational capabilities. Here, we report a tactile-olfactory sensing array, inspired by the natural sense-fusion system of the star-nosed mole, that permits real-time acquisition of the local topography, stiffness, and odor of a variety of objects without visual input. The tactile-olfactory information is processed by a bioinspired olfactory-tactile associated machine-learning algorithm, essentially mimicking the biological fusion procedures in the neural system of the star-nosed mole. Aiming at human identification during rescue missions in challenging environments such as dark or buried scenarios, our tactile-olfactory intelligent sensing system classified 11 typical objects with an accuracy of 96.9% in a simulated rescue scenario at a fire department test site. The tactile-olfactory bionic sensing system requires no visual input and shows superior tolerance to environmental interference, highlighting its great potential for robust object recognition in difficult environments where other methods fall short.

Date: 2022
Citations: View citations in EconPapers (5)

Downloads: (external link)
https://www.nature.com/articles/s41467-021-27672-z Abstract (text/html)

Persistent link: https://EconPapers.repec.org/RePEc:nat:natcom:v:13:y:2022:i:1:d:10.1038_s41467-021-27672-z

Ordering information: This journal article can be ordered from
https://www.nature.com/ncomms/

DOI: 10.1038/s41467-021-27672-z

Nature Communications is currently edited by Nathalie Le Bot, Enda Bergin and Fiona Gillespie

More articles in Nature Communications from Nature
Bibliographic data for series maintained by Sonal Shukla () and Springer Nature Abstracting and Indexing ().

 
Page updated 2025-03-19
Handle: RePEc:nat:natcom:v:13:y:2022:i:1:d:10.1038_s41467-021-27672-z