Interchangeability of Cross-Platform Orthophotographic and LiDAR Data in DeepLabV3+-Based Land Cover Classification Method

Shijun Pan (), Keisuke Yoshida (), Satoshi Nishiyama, Takashi Kojima and Yutaro Hashimoto
Additional contact information
Shijun Pan: Graduate School of Environmental and Life Science, Okayama University, 2-1-1, Tsushima-Naka, Kita-ku, Okayama-shi 700-8530, Japan
Keisuke Yoshida: Graduate School of Environmental and Life Science, Okayama University, 2-1-1, Tsushima-Naka, Kita-ku, Okayama-shi 700-8530, Japan
Satoshi Nishiyama: Graduate School of Environmental and Life Science, Okayama University, 2-1-1, Tsushima-Naka, Kita-ku, Okayama-shi 700-8530, Japan
Takashi Kojima: TOKEN C. E. E. Consultants Co., Ltd., 1-36-1 Azuma-cho, Omiya-ku, Saitama-shi 330-0841, Japan
Yutaro Hashimoto: Graduate School of Environmental and Life Science, Okayama University, 2-1-1, Tsushima-Naka, Kita-ku, Okayama-shi 700-8530, Japan

Land, 2025, vol. 14, issue 2, 1-17

Abstract: Riverine environmental information is important to collect, yet its collection still relies on personnel conducting field surveys, and these on-site tasks face significant limitations (e.g., sites that are difficult or dangerous to access). In recent years, airborne Light Detection and Ranging (LiDAR) technologies have been applied as an efficient data collection approach in environmental research worldwide, e.g., for land cover classification (LCC) and environmental monitoring. This study focused on seven land cover classes (bamboo, tree, grass, bare ground, water, road, and clutter) that can be parameterized for flood simulation. A validated airborne LiDAR bathymetry (ALB) system and a UAV-borne green LiDAR system (GLS) were used for cross-platform analysis of LCC. Furthermore, LiDAR data were visualized using high-contrast color scales to improve classification accuracy through image fusion. When high-resolution aerial imagery is available, it must be downscaled to match the resolution of the low-resolution point clouds. Cross-platform data interchangeability was assessed with an interchangeability measure, defined as the absolute difference in overall accuracy (OA) or macro-F1 between cross-platform evaluations. Notably, aerial photographs alone are inadequate for precise labeling, particularly under limited sunlight, which can lead to misclassification; in such cases, LiDAR plays a crucial role in target recognition. All approaches (low-resolution digital imagery, LiDAR-derived imagery, and image fusion) achieved an OA above 0.65 and a macro-F1 of around 0.6. The vegetation classes (bamboo, tree, grass) and road performed comparatively better than the clutter and bare ground classes. Under the stated conditions, differences between the datasets acquired in different years (ALB in 2017 and GLS in 2020) are the main reason. Because the clutter class comprises all items not belonging to the other classes in this research, its RGB-based features cannot be substituted as easily across the three-year gap as those of the other classes. Owing to on-site reconstruction, the bare ground class also changed in color between the ALB and GLS acquisitions, which reduced interchangeability. For individual classes, irrespective of season and platform, image fusion classified bamboo and trees with higher F1 scores than low-resolution digital imagery or LiDAR-derived imagery, which particularly demonstrates cross-platform interchangeability for the tall vegetation types. In recent years, high-resolution UAV photography, high-precision LiDAR measurement (ALB, GLS), and satellite imagery have all been used; however, LiDAR measurement equipment is expensive and measurement opportunities are limited. It would therefore be desirable for ALB and GLS data to be classified continuously by artificial intelligence, and this study investigated such data interchangeability. A unique and crucial aspect of this study is its exploration of the interchangeability of land cover classification models across different LiDAR platforms.
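
The abstract describes the interchangeability measure only verbally (an absolute difference in overall accuracy or macro-F1 across platforms). As a minimal illustrative sketch, and not the authors' implementation, the Python/NumPy code below computes OA, macro-F1 over the seven listed classes, and an absolute-difference interchangeability score; the function names, the class encoding, and the ALB/GLS usage comments are assumptions made for illustration.

import numpy as np

# Seven riverine land cover classes named in the abstract, encoded as 0..6.
CLASSES = ["bamboo", "tree", "grass", "bare ground", "water", "road", "clutter"]

def overall_accuracy(y_true, y_pred):
    # Fraction of pixels whose predicted class matches the reference label.
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    return float(np.mean(y_true == y_pred))

def macro_f1(y_true, y_pred, n_classes=len(CLASSES)):
    # Unweighted mean of the per-class F1 scores.
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    f1_scores = []
    for c in range(n_classes):
        tp = np.sum((y_pred == c) & (y_true == c))
        fp = np.sum((y_pred == c) & (y_true != c))
        fn = np.sum((y_pred != c) & (y_true == c))
        precision = tp / (tp + fp) if (tp + fp) else 0.0
        recall = tp / (tp + fn) if (tp + fn) else 0.0
        f1_scores.append(2 * precision * recall / (precision + recall)
                         if (precision + recall) else 0.0)
    return float(np.mean(f1_scores))

def interchangeability(metric_a, metric_b):
    # Absolute difference in a metric (OA or macro-F1) between two evaluation
    # settings, e.g. a model scored on ALB-derived imagery versus the same
    # model scored on GLS-derived imagery; smaller means more interchangeable.
    return abs(metric_a - metric_b)

# Hypothetical usage with integer class maps (values 0..6):
# oa_alb = overall_accuracy(labels_alb, preds_on_alb)
# oa_gls = overall_accuracy(labels_gls, preds_on_gls)
# print(interchangeability(oa_alb, oa_gls))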

Keywords: airborne LiDAR bathymetry; cross-platform; deep learning; green LiDAR system; riverine land cover classification
JEL-codes: Q15 Q2 Q24 Q28 Q5 R14 R52
Date: 2025

Downloads: (external link)
https://www.mdpi.com/2073-445X/14/2/217/pdf (application/pdf)
https://www.mdpi.com/2073-445X/14/2/217/ (text/html)

Persistent link: https://EconPapers.repec.org/RePEc:gam:jlands:v:14:y:2025:i:2:p:217-:d:1572863

Handle: RePEc:gam:jlands:v:14:y:2025:i:2:p:217-:d:1572863