Convolutional Neural Network Outperforms Graph Neural Network on the Spatially Variant Graph Data

Anna Boronina, Vladimir Maksimenko and Alexander E. Hramov
Additional contact information
Anna Boronina: Center for Technologies in Robotics and Mechatronics Components, Innopolis University, 420500 Innopolis, Russia
Vladimir Maksimenko: Center for Technologies in Robotics and Mechatronics Components, Innopolis University, 420500 Innopolis, Russia
Alexander E. Hramov: Engineering School of Information Technologies, Telecommunications and Control Systems, Ural Federal University, 620002 Ekaterinburg, Russia

Mathematics, 2023, vol. 11, issue 11, 1-13

Abstract: Applying machine learning algorithms to graph-structured data has garnered significant attention in recent years due to the prevalence of inherent graph structures in real-life datasets. However, the direct application of traditional deep learning algorithms, such as Convolutional Neural Networks (CNNs), is limited because they are designed for regular Euclidean data like 2D grids and 1D sequences. In contrast, graph-structured data are in a non-Euclidean form. Graph Neural Networks (GNNs) are specifically designed to handle non-Euclidean data and make predictions based on connectivity rather than spatial structure. Real-life graph data can be broadly categorized into two types: spatially-invariant graphs, where the link structure between nodes is independent of their spatial positions, and spatially-variant graphs, where node positions provide additional information about the graph’s properties. However, there is limited understanding of the effect of spatial variance on the performance of Graph Neural Networks. In this study, we aim to address this issue by comparing the performance of GNNs and CNNs on spatially-variant and spatially-invariant graph data. Spatially-variant graphs, when represented as adjacency matrices, can exhibit a Euclidean-like spatial structure. Based on this distinction, we hypothesize that CNNs may outperform GNNs when working with spatially-variant graphs, while GNNs may excel on spatially-invariant graphs. To test this hypothesis, we compared the performance of CNNs and GNNs under two scenarios: (i) graphs in the training and test sets had the same connectivity pattern and spatial structure, and (ii) graphs in the training and test sets had the same connectivity pattern but different spatial structures. Our results confirmed that the presence of spatial structure in a graph allows for the effective use of CNNs, which may even outperform GNNs. Thus, our study contributes to the understanding of the effect of spatial graph structure on the performance of machine learning methods and allows for the selection of an appropriate algorithm based on the spatial properties of the real-life graph dataset.
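To make the comparison in the abstract concrete, the following is a minimal sketch (not the authors' code) of the two model families it contrasts: a CNN that reads the N x N adjacency matrix as a single-channel 2D "image", so node ordering and spatial structure matter, and a simple Kipf-and-Welling-style graph-convolution model that uses connectivity only. The framework (PyTorch), the number of nodes N, the layer sizes, and the binary classification task are all illustrative assumptions.

import torch
import torch.nn as nn

N = 30  # assumed number of nodes per graph (illustrative)


class AdjacencyCNN(nn.Module):
    """Treats the adjacency matrix as a 2D grid, so the spatial layout of the matrix matters."""
    def __init__(self, n_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(8, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(16, n_classes)

    def forward(self, adj: torch.Tensor) -> torch.Tensor:
        # adj: (batch, N, N) -> add a channel dimension for Conv2d
        h = self.features(adj.unsqueeze(1))
        return self.classifier(h.flatten(1))


class SimpleGCN(nn.Module):
    """One graph-convolution step H' = ReLU(A_hat H W) with mean pooling: predictions depend on connectivity only."""
    def __init__(self, n_classes: int = 2, hidden: int = 16):
        super().__init__()
        self.w = nn.Linear(N, hidden)   # node features assumed to be the rows of the adjacency matrix
        self.out = nn.Linear(hidden, n_classes)

    def forward(self, adj: torch.Tensor) -> torch.Tensor:
        # Symmetrically normalised adjacency with self-loops: A_hat = D^{-1/2} (A + I) D^{-1/2}
        a_hat = adj + torch.eye(N, device=adj.device)
        deg_inv_sqrt = a_hat.sum(-1).clamp(min=1).pow(-0.5)
        a_norm = deg_inv_sqrt.unsqueeze(-1) * a_hat * deg_inv_sqrt.unsqueeze(-2)
        h = torch.relu(a_norm @ self.w(adj))   # one round of message passing
        return self.out(h.mean(dim=1))         # graph-level readout


if __name__ == "__main__":
    batch = torch.bernoulli(torch.full((4, N, N), 0.2))    # random graphs for a shape check
    batch = ((batch + batch.transpose(1, 2)) > 0).float()  # symmetrise
    print(AdjacencyCNN()(batch).shape, SimpleGCN()(batch).shape)  # both (4, 2)

Under the paper's two scenarios, the CNN can exploit a consistent spatial structure in the adjacency matrices (scenario i), while the GCN-style model is indifferent to it, which is the distinction the study evaluates.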

Keywords: graph neural network (GNN); convolutional neural network (CNN); classification; graph structures; adjacency matrix; modularity; segregation; clustering; spatial invariance (search for similar items in EconPapers)
JEL-codes: C (search for similar items in EconPapers)
Date: 2023

Downloads: (external link)
https://www.mdpi.com/2227-7390/11/11/2515/pdf (application/pdf)
https://www.mdpi.com/2227-7390/11/11/2515/ (text/html)

Related works:
This item may be available elsewhere in EconPapers: Search for items with the same title.

Persistent link: https://EconPapers.repec.org/RePEc:gam:jmathe:v:11:y:2023:i:11:p:2515-:d:1159700

Mathematics is currently edited by Ms. Emma He

More articles in Mathematics from MDPI
Bibliographic data for series maintained by MDPI Indexing Manager.

Handle: RePEc:gam:jmathe:v:11:y:2023:i:11:p:2515-:d:1159700