Urban Functional Zone Classification via Advanced Multi-Modal Data Fusion
Tianyu Liu,
Hongbing Chen,
Junfeng Ren,
Long Zhang,
Hongrui Chen,
Rundong Hong,
Chenshuang Li,
Wenlong Cui,
Wenhua Guo and
Changji Wen
Additional contact information
Tianyu Liu: College of Information and Technology, Jilin Agricultural University, Changchun 130118, China
Hongbing Chen: College of Information and Technology, Jilin Agricultural University, Changchun 130118, China
Junfeng Ren: College of Information and Technology, Jilin Agricultural University, Changchun 130118, China
Long Zhang: College of Information and Technology, Jilin Agricultural University, Changchun 130118, China
Hongrui Chen: Computer Technology and Engineering, Changchun Institute of Technology, Changchun 130000, China
Rundong Hong: College of Information and Technology, Jilin Agricultural University, Changchun 130118, China
Chenshuang Li: College of Information and Technology, Jilin Agricultural University, Changchun 130118, China
Wenlong Cui: College of Information and Technology, Jilin Agricultural University, Changchun 130118, China
Wenhua Guo: Information Center of Ministry of Natural Resources, Beijing 100830, China
Changji Wen: College of Information and Technology, Jilin Agricultural University, Changchun 130118, China
Sustainability, 2024, vol. 16, issue 24, 1-26
Abstract:
The classification of urban functional zones is crucial for improving land use efficiency and promoting balanced development across urban areas. Existing methods for classifying urban functional zones using mobile signaling data face challenges primarily due to the limitations of single data sources, insufficient utilization of multidimensional data, and inherent inaccuracies in mobile signaling data. To address these issues, this study proposes an innovative classification method that employs advanced multimodal data fusion techniques to enhance the accuracy and reliability of functional zone classification. Mobile signaling data are mapped into image data using timestamp and geographic location information and combined with point of interest (POI) data to construct a comprehensive multimodal dataset. Deep learning techniques are then applied to fuse the multimodal data features, enabling precise and reliable classification of functional zones. The experimental results demonstrate that this method achieves an accuracy of 95.128% in classifying urban functional zones, significantly outperforming methods that use single-modal data.
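The abstract's central step, mapping mobile signaling records into image data via timestamp and geographic location, can be sketched as an hour-by-hour spatial density grid. This is a minimal illustrative sketch only: the record layout `(hour, lat, lon)`, the grid size, and the function name are assumptions for demonstration, not the authors' implementation.

```python
# Illustrative sketch: rasterizing mobile signaling records into an
# hour-by-hour spatial density grid (an "image" per hour). The field
# names, bin counts, and bounding box are assumed for demonstration.

def signaling_to_image(records, lat_bins, lon_bins, bbox, hours=24):
    """records: iterable of (hour, lat, lon) tuples.
    Returns a hours x lat_bins x lon_bins nested-list count grid."""
    min_lat, max_lat, min_lon, max_lon = bbox
    image = [[[0] * lon_bins for _ in range(lat_bins)]
             for _ in range(hours)]
    for hour, lat, lon in records:
        # Skip records falling outside the study area.
        if not (min_lat <= lat < max_lat and min_lon <= lon < max_lon):
            continue
        # Linearly bin latitude/longitude into grid indices.
        i = int((lat - min_lat) / (max_lat - min_lat) * lat_bins)
        j = int((lon - min_lon) / (max_lon - min_lon) * lon_bins)
        image[hour % hours][i][j] += 1
    return image

# Two records in the same cell at 08:00, one elsewhere at 20:00.
img = signaling_to_image(
    [(8, 0.55, 0.35), (8, 0.55, 0.35), (20, 0.9, 0.1)],
    lat_bins=10, lon_bins=10, bbox=(0.0, 1.0, 0.0, 1.0))
```

Each hourly slice can then be stacked into a multi-channel tensor and fused with POI-derived features by a deep network, as the abstract describes; the fusion architecture itself is not specified in this record.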
Keywords: urban functional zone classification; mobile signaling data; deep learning; multimodal data; feature fusion; feature extraction
JEL-codes: O13 Q Q0 Q2 Q3 Q5 Q56
Date: 2024
Downloads: (external link)
https://www.mdpi.com/2071-1050/16/24/11145/pdf (application/pdf)
https://www.mdpi.com/2071-1050/16/24/11145/ (text/html)
Persistent link: https://EconPapers.repec.org/RePEc:gam:jsusta:v:16:y:2024:i:24:p:11145-:d:1547547
Sustainability is currently edited by Ms. Alexandra Wu
Bibliographic data for series maintained by MDPI Indexing Manager.