LostNet: A smart way for lost and find
Meihua Zhou, Ivan Fung, Li Yang, Nan Wan, Keke Di and Tingting Wang
PLOS ONE, 2024, vol. 19, issue 10, 1-17
Abstract:
The rapid population growth in urban areas has led to an increased frequency of lost and unclaimed items in public spaces such as public transportation, restaurants, and other venues. Services like Find My iPhone efficiently track lost electronic devices, but many valuable items remain unmonitored, resulting in delays in reclaiming lost and found items. This research presents a method to streamline the search process by comparing images of lost items provided by owners with photos taken when items are registered as found. A photo matching network is proposed, integrating the transfer learning capabilities of MobileNetV2 with the Convolutional Block Attention Module (CBAM) and utilizing perceptual hashing algorithms for their simplicity and speed. An Internet framework based on the Spring Boot system supports the development of an online lost and found image identification system. The implementation achieves a testing accuracy of 96.8% while using only 0.67 GFLOPs and 3.5M training parameters, enabling real-world image recognition on standard laptops.
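The abstract credits perceptual hashing with simplicity and speed but does not specify which variant the authors use. As a minimal sketch, the average hash (aHash), one common perceptual hash, can be written in pure Python; the 8x8 grayscale grid, the mean-threshold rule, and the example images below are illustrative assumptions, not the paper's implementation.

```python
def average_hash(pixels):
    """Compute an average hash from a 2D grid of grayscale values.

    In a real pipeline the image would first be downscaled (e.g. to 8x8)
    and converted to grayscale; here `pixels` is assumed to already be
    that small grid.
    """
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    # Each bit records whether a pixel is brighter than the image mean.
    return [1 if p > mean else 0 for p in flat]

def hamming(h1, h2):
    """Number of differing bits; a small distance suggests the same item."""
    return sum(a != b for a, b in zip(h1, h2))

# Illustrative 8x8 "images": a gradient, a uniformly brightened copy,
# and an inverted gradient.
img = [[8 * i + j for j in range(8)] for i in range(8)]
brighter = [[p + 10 for p in row] for row in img]
inverted = [[63 - p for p in row] for row in img]

print(hamming(average_hash(img), average_hash(brighter)))  # 0: thresholding against the mean cancels a uniform brightness shift
print(hamming(average_hash(img), average_hash(inverted)))  # 64: every bit flips
```

Because comparison reduces to counting differing bits, such hashes can pre-filter candidate matches cheaply before a heavier network (here, MobileNetV2 with CBAM) scores the survivors.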
Date: 2024
Downloads: (external link)
https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0310998 (text/html)
https://journals.plos.org/plosone/article/file?id= ... 10998&type=printable (application/pdf)
Persistent link: https://EconPapers.repec.org/RePEc:plo:pone00:0310998
DOI: 10.1371/journal.pone.0310998