EconPapers    

Assessment of Three Automated Identification Methods for Ground Object Based on UAV Imagery

Ke Zhang, Sarvesh Maskey, Hiromu Okazawa, Kiichiro Hayashi, Tamano Hayashi, Ayako Sekiyama, Sawahiko Shimada and Lameck Fiwa
Additional contact information
Ke Zhang: Faculty of Agriculture, Graduate School of Tokyo University of Agriculture, Tokyo 156-8502, Japan
Sarvesh Maskey: Faculty of Regional Environment Science, Tokyo University of Agriculture, Tokyo 156-8502, Japan
Hiromu Okazawa: Faculty of Regional Environment Science, Tokyo University of Agriculture, Tokyo 156-8502, Japan
Kiichiro Hayashi: Institute of Materials and Systems for Sustainability, Nagoya University, Nagoya 464-8601, Japan
Tamano Hayashi: Faculty of Advanced Science and Technology, Ryukoku University, Kyoto 612-8577, Japan
Ayako Sekiyama: Faculty of Regional Environment Science, Tokyo University of Agriculture, Tokyo 156-8502, Japan
Sawahiko Shimada: Faculty of Regional Environment Science, Tokyo University of Agriculture, Tokyo 156-8502, Japan
Lameck Fiwa: Faculty of Agriculture, Lilongwe University of Agriculture and Natural Resources, Lilongwe P.O. Box 219, Malawi

Sustainability, 2022, vol. 14, issue 21, 1-19

Abstract: Identification and monitoring of diverse resources and wastes on the ground are important for integrated resource management. The unmanned aerial vehicle (UAV), with its high resolution and ease of operation, is an optimal tool for monitoring ground objects accurately and efficiently. However, previous studies have focused on applying classification methodologies to land use and agronomy, and few have compared different classification methods using UAV imagery. To fully exploit the high resolution of UAV imagery, classification methodology should be applied to ground object identification. This study compared three classification methods: A. NDVI threshold, B. RGB image-based machine learning, and C. object-based image analysis (OBIA). Method A was the least time-consuming and identified vegetation and soil with high accuracy (user’s accuracy > 0.80), but performed poorly at classifying dead vegetation, plastic, and metal (user’s accuracy < 0.50). Methods B and C were both time- and labor-intensive, but separated vegetation, soil, plastic, and metal with very high accuracy (user’s accuracy ≥ 0.70 for all classes). Method B performed well at identifying objects with bright colors, whereas Method C excelled at separating objects with similar visual appearances. Scientifically, this study verified that existing classification methods can identify small ground objects less than 1 m in size, and discussed the reasons for the differing accuracies of the three methods. Practically, these results help users from different fields choose a method appropriate to their target, so that different wastes or multiple resources can be monitored simultaneously by combining methods, which contributes to an improved integrated resource management system.
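The NDVI threshold approach (Method A in the abstract) can be sketched in a few lines. The NDVI formula itself, (NIR − Red) / (NIR + Red), is standard; the threshold values below are illustrative placeholders, not the calibrated values from the paper:

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    # Small epsilon avoids division by zero on dark pixels.
    return (nir - red) / (nir + red + 1e-12)

def classify_by_ndvi(nir, red, veg_thresh=0.4, soil_thresh=0.1):
    """Toy per-pixel threshold classifier in the spirit of Method A.

    Labels: 2 = vegetation, 1 = soil, 0 = other (e.g. plastic, metal,
    dead vegetation). Thresholds are hypothetical, for illustration only.
    """
    v = ndvi(nir, red)
    labels = np.zeros(v.shape, dtype=np.uint8)
    labels[v >= soil_thresh] = 1   # moderate NDVI: bare soil
    labels[v >= veg_thresh] = 2    # high NDVI: live vegetation
    return labels
```

Because every pixel is labeled by two comparisons, this method is fast, which matches the abstract's finding that Method A is the least time-consuming; its weakness with plastic, metal, and dead vegetation follows from those materials having overlapping, low NDVI values.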

Keywords: UAV; NDVI; orthomosaic; classification; OBIA; machine learning; threshold
JEL-codes: O13 Q Q0 Q2 Q3 Q5 Q56
Date: 2022

Downloads: (external link)
https://www.mdpi.com/2071-1050/14/21/14603/pdf (application/pdf)
https://www.mdpi.com/2071-1050/14/21/14603/ (text/html)

Related works:
This item may be available elsewhere in EconPapers: Search for items with the same title.

Persistent link: https://EconPapers.repec.org/RePEc:gam:jsusta:v:14:y:2022:i:21:p:14603-:d:965189

Sustainability is currently edited by Ms. Alexandra Wu

More articles in Sustainability from MDPI
Bibliographic data for series maintained by MDPI Indexing Manager ().

Page updated 2025-03-19
Handle: RePEc:gam:jsusta:v:14:y:2022:i:21:p:14603-:d:965189