Stability-Optimized Graph Convolutional Network: A Novel Propagation Rule with Constraints Derived from ODEs

Liping Chen, Hongji Zhu and Shuguang Han
Additional contact information
Liping Chen: School of Science, Zhejiang Sci-Tech University, Hangzhou 310000, China
Hongji Zhu: School of Computer Science, Zhejiang Sci-Tech University, Hangzhou 310000, China
Shuguang Han: School of Science, Zhejiang Sci-Tech University, Hangzhou 310000, China

Mathematics, 2025, vol. 13, issue 5, 1-13

Abstract: The node representation learning capability of Graph Convolutional Networks (GCNs) is fundamentally constrained by dynamic instability during feature propagation, yet existing research lacks systematic theoretical analysis of stability control mechanisms. This paper proposes a Stability-Optimized Graph Convolutional Network (SO-GCN) that enhances training stability and feature expressiveness in shallow architectures through continuous–discrete dual-domain stability constraints. By constructing continuous dynamical equations for GCNs and rigorously proving conditional stability under arbitrary parameter dimensions using nonlinear operator theory, we establish theoretical foundations. A Precision Weight Parameter Mechanism is introduced to determine critical Frobenius norm thresholds through feature contraction rates, optimized via differentiable penalty terms. Simultaneously, a Dynamic Step-size Adjustment Mechanism regulates propagation steps based on spectral properties of instantaneous Jacobian matrices and forward Euler discretization. Experimental results demonstrate SO-GCN’s superiority: 1.1–10.7% accuracy improvement on homophilic graphs (Cora/CiteSeer) and 11.22–12.09% enhancement on heterophilic graphs (Texas/Chameleon) compared to conventional GCN. Hilbert–Schmidt Independence Criterion (HSIC) analysis reveals SO-GCN’s superior inter-layer feature independence maintenance across 2–7 layers. This study establishes a novel theoretical paradigm for graph network stability analysis, with practical implications for optimizing shallow architectures in real-world applications.
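
A minimal Python sketch of the propagation idea summarized in the abstract: the GCN layer is read as a forward-Euler step of an assumed ODE dH/dt = tanh(A_norm H W) - H, the step size is shrunk when a crude spectral/Lipschitz estimate of the layer grows (a stand-in for the Dynamic Step-size Adjustment Mechanism), and a differentiable hinge penalty discourages the Frobenius norm of W from exceeding a critical threshold (a stand-in for the Precision Weight Parameter Mechanism). The function names, the ODE form, the threshold, and the penalty weight are illustrative assumptions for this sketch, not the paper's exact equations.

import numpy as np

def normalized_adjacency(A):
    """Symmetrically normalized adjacency with self-loops: D^{-1/2}(A+I)D^{-1/2}."""
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

def so_gcn_layer(A_norm, H, W, tau=0.9):
    """
    One propagation step read as a forward-Euler discretization of
    dH/dt = tanh(A_norm @ H @ W) - H  (assumed ODE form).
    The step size shrinks when the local Lipschitz estimate
    ||A_norm||_2 * ||W||_F is large, mimicking a stability constraint.
    """
    lipschitz_est = np.linalg.norm(A_norm, 2) * np.linalg.norm(W, 'fro')
    step = min(1.0, tau / max(lipschitz_est, 1e-8))  # dynamic step size (assumption)
    return H + step * (np.tanh(A_norm @ H @ W) - H)

def frobenius_penalty(W, threshold, weight=1e-2):
    """Differentiable hinge penalty pushing ||W||_F below a critical threshold (assumed form)."""
    return weight * max(np.linalg.norm(W, 'fro') - threshold, 0.0) ** 2

# Toy usage on a 4-node path graph with 3-dimensional node features.
rng = np.random.default_rng(0)
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
A_norm = normalized_adjacency(A)
H = rng.standard_normal((4, 3))
W = rng.standard_normal((3, 3))
H_next = so_gcn_layer(A_norm, H, W)
print(H_next.shape, frobenius_penalty(W, threshold=1.5))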

Keywords: stable propagation rule; graph convolutional network; precision weight parameter mechanism; dynamic step-size adjustment mechanism
JEL-codes: C
Date: 2025

Downloads: (external link)
https://www.mdpi.com/2227-7390/13/5/761/pdf (application/pdf)
https://www.mdpi.com/2227-7390/13/5/761/ (text/html)



Persistent link: https://EconPapers.repec.org/RePEc:gam:jmathe:v:13:y:2025:i:5:p:761-:d:1599842


Mathematics is currently edited by Ms. Emma He

More articles in Mathematics from MDPI.

 
Handle: RePEc:gam:jmathe:v:13:y:2025:i:5:p:761-:d:1599842