Improving Typical Urban Land-Use Classification with Active-Passive Remote Sensing and Multi-Attention Modules Hybrid Network: A Case Study of Qibin District, Henan, China

Zhiwen Yang, Hebing Zhang (), Xiaoxuan Lyu and Weibing Du
Additional contact information
Zhiwen Yang: School of Surveying and Land Information Engineering, Henan Polytechnic University, Jiaozuo 454000, China
Hebing Zhang: School of Surveying and Land Information Engineering, Henan Polytechnic University, Jiaozuo 454000, China
Xiaoxuan Lyu: School of Surveying and Land Information Engineering, Henan Polytechnic University, Jiaozuo 454000, China
Weibing Du: School of Surveying and Land Information Engineering, Henan Polytechnic University, Jiaozuo 454000, China

Sustainability, 2022, vol. 14, issue 22, 1-27

Abstract: The study of high-precision land-use classification is essential for the sustainable development of land resources. This study addresses the problem of classification errors in optical remote-sensing images under high surface humidity, cloud cover, and hazy weather. Synthetic aperture radar (SAR) images are sensitive to soil moisture, and microwaves can penetrate clouds, haze, and smoke. Drawing on both active and passive remote sensing, Sentinel-1A SAR and Sentinel-2B multispectral (MS) images are combined synergistically, and a full-band data set combining SAR + MS + spectral indexes is constructed. To handle the high dimensionality and heterogeneity of this data set, a new framework (MAM-HybridNet), which couples two-dimensional (2D) and three-dimensional (3D) hybrid convolutional neural networks with multi-attention modules (MAMs), is proposed to improve the accuracy of land-use classification in cities with high surface humidity. In addition, using the same training samples and the full-band data (SAR + MS + spectral indexes), the proposed model is compared with k-nearest neighbors (KNN), support vector machine (SVM), 2D convolutional neural network, 3D convolutional neural network, and HybridSN classification models to verify its accuracy. The results show that (1) fusion classification based on Sentinel-2B MSI and Sentinel-1A SAR data produces an overall accuracy (OA) of 95.10%, a kappa coefficient (KC) of 0.93, and an average accuracy (AA) of 92.86%, which is better than the classification results using Sentinel-2B MSI and Sentinel-1A SAR images separately. (2) The classification accuracy improves upon adding the spectral indexes, with the OA, KC, and AA improving by 3.77%, 0.05, and 5.5%, respectively. (3) With the support of the full-band data, the proposed algorithm outperforms the other classification algorithms, with an OA of 98.87%, a KC of 0.98, and an AA of 98.36%. These results indicate that the synergistic use of active and passive remote-sensing data improves land-use classification and verify the effectiveness of the proposed deep-learning classification model.
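
The abstract describes, but does not define, the 2D/3D hybrid CNN with attention. As a rough illustrative sketch only, the Python (PyTorch) code below shows the general pattern such a network follows: 3D convolutions over a stacked SAR + MS + spectral-index cube, a fold into 2D feature maps, a channel-attention block, and a classification head. All names (HybridAttentionNet, ChannelAttention, n_bands), layer sizes, and the attention design are assumptions for illustration, not the authors' published MAM-HybridNet architecture.

# Hypothetical sketch of a 3D + 2D hybrid CNN with a channel-attention block,
# in the spirit of the MAM-HybridNet described in the abstract. Layer sizes,
# names, and the attention design are illustrative assumptions.
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    """Squeeze-and-excitation style channel attention (an assumed stand-in
    for one of the paper's multi-attention modules)."""
    def __init__(self, channels, reduction=4):
        super().__init__()
        self.fc = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        return x * self.fc(x)   # reweight feature maps channel-wise

class HybridAttentionNet(nn.Module):
    """3D convolutions extract joint spectral-spatial features from the stacked
    SAR + MS + spectral-index cube; 2D convolutions plus channel attention
    refine spatial features before classification."""
    def __init__(self, n_bands, n_classes):
        super().__init__()
        self.conv3d = nn.Sequential(
            nn.Conv3d(1, 8, kernel_size=(7, 3, 3), padding=(0, 1, 1)),
            nn.ReLU(inplace=True),
            nn.Conv3d(8, 16, kernel_size=(5, 3, 3), padding=(0, 1, 1)),
            nn.ReLU(inplace=True),
        )
        depth = n_bands - 7 + 1 - 5 + 1   # spectral depth left after the 3D convs
        self.conv2d = nn.Sequential(
            nn.Conv2d(16 * depth, 64, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
        )
        self.attn = ChannelAttention(64)
        self.head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(64, n_classes),
        )

    def forward(self, x):                  # x: (batch, bands, H, W) image patch
        x = x.unsqueeze(1)                 # -> (batch, 1, bands, H, W) for Conv3d
        x = self.conv3d(x)
        b, c, d, h, w = x.shape
        x = x.reshape(b, c * d, h, w)      # fold spectral depth into 2D channels
        x = self.conv2d(x)
        x = self.attn(x)
        return self.head(x)

# Example: 17 input bands (SAR + MS + spectral indexes, an assumed count)
# and 6 land-use classes, on 11 x 11 pixel patches.
if __name__ == "__main__":
    model = HybridAttentionNet(n_bands=17, n_classes=6)
    logits = model(torch.randn(2, 17, 11, 11))
    print(logits.shape)                    # torch.Size([2, 6])

The 3D-then-2D layout mirrors the HybridSN idea the paper compares against: 3D kernels capture spectral-spatial correlations across the fused bands, while the cheaper 2D stage and attention refine spatial detail for the per-pixel class decision.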

Keywords: land-use classification; convolutional neural networks (CNN); active and passive remote sensing; data fusion (search for similar items in EconPapers)
JEL-codes: O13 Q Q0 Q2 Q3 Q5 Q56 (search for similar items in EconPapers)
Date: 2022
References: View complete reference list from CitEc
Citations: View citations in EconPapers (2)

Downloads: (external link)
https://www.mdpi.com/2071-1050/14/22/14723/pdf (application/pdf)
https://www.mdpi.com/2071-1050/14/22/14723/ (text/html)

Related works:
This item may be available elsewhere in EconPapers: Search for items with the same title.

Persistent link: https://EconPapers.repec.org/RePEc:gam:jsusta:v:14:y:2022:i:22:p:14723-:d:966849

Sustainability is currently edited by Ms. Alexandra Wu

More articles in Sustainability from MDPI
Bibliographic data for series maintained by MDPI Indexing Manager ().

Handle: RePEc:gam:jsusta:v:14:y:2022:i:22:p:14723-:d:966849