Robust Graph Neural Networks via Ensemble Learning
Qi Lin,
Shuo Yu,
Ke Sun,
Wenhong Zhao,
Osama Alfarraj,
Amr Tolba and
Feng Xia
Additional contact information
Qi Lin: School of Software, Dalian University of Technology, Dalian 116620, China
Shuo Yu: School of Software, Dalian University of Technology, Dalian 116620, China
Ke Sun: School of Software, Dalian University of Technology, Dalian 116620, China
Wenhong Zhao: Ultraprecision Machining Center, Zhejiang University of Technology, Hangzhou 310014, China
Osama Alfarraj: Computer Science Department, Community College, King Saud University, Riyadh 11437, Saudi Arabia
Amr Tolba: Computer Science Department, Community College, King Saud University, Riyadh 11437, Saudi Arabia
Feng Xia: School of Engineering, IT and Physical Sciences, Federation University Australia, Ballarat, VIC 3353, Australia
Mathematics, 2022, vol. 10, issue 8, 1-14
Abstract:
Graph neural networks (GNNs) have demonstrated a remarkable ability in the task of semi-supervised node classification. However, most existing GNNs suffer from nonrobustness issues, which pose a great challenge for applying GNNs in sensitive scenarios. Some researchers have concentrated on constructing ensemble models to mitigate nonrobustness. Nevertheless, these methods ignore the interaction among base models, leading to similar graph representations. Moreover, due to the deterministic propagation applied in most existing GNNs, each node relies heavily on its neighbors, leaving nodes sensitive to perturbations. Therefore, in this paper, we propose a novel framework of graph ensemble learning based on knowledge passing (called GEL) to address the above issues. To achieve interaction, we treat the predictions of prior models as knowledge and use them to obtain more reliable predictions. Moreover, we design a multilayer DropNode propagation strategy to reduce each node's dependence on particular neighbors. This strategy also empowers each node to aggregate information from diverse neighbors, alleviating oversmoothing issues. We conduct experiments on three benchmark datasets: Cora, Citeseer, and Pubmed. GEL outperforms GCN by more than 5% in terms of accuracy across all three datasets and also performs better than other state-of-the-art baselines. Extensive experimental results also show that GEL alleviates the nonrobustness and oversmoothing issues.
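The multilayer DropNode propagation described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes a dense symmetric-normalized adjacency with self-loops, and all function and parameter names here are illustrative. During training, each node's entire feature row is dropped with probability p (survivors are rescaled by 1/(1-p) to preserve expectations), so no node can depend deterministically on any particular neighbor; representations are then averaged across propagation layers.

```python
import numpy as np

def drop_node(X, p, rng):
    # Zero out entire node feature rows with probability p, then rescale
    # survivors by 1/(1-p) so the expected features are unchanged.
    mask = rng.random(X.shape[0]) >= p
    return X * mask[:, None] / (1.0 - p)

def multilayer_dropnode_propagation(A, X, p=0.5, num_layers=2, seed=0):
    # A: dense adjacency matrix (n x n); X: node features (n x d).
    rng = np.random.default_rng(seed)
    # Symmetric normalization with self-loops: D^{-1/2} (A + I) D^{-1/2}.
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    A_norm = A_hat / np.sqrt(np.outer(d, d))
    H = drop_node(X, p, rng)
    out = H.copy()
    for _ in range(num_layers):
        H = A_norm @ H          # aggregate from the surviving neighbors
        out = out + H
    # Average representations across all propagation depths, which lets each
    # node mix information from neighborhoods of different radii.
    return out / (num_layers + 1)
```

With p = 0 no nodes are dropped and the propagation reduces to a deterministic multi-hop feature smoothing, which is the behavior the random masking is designed to perturb.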
Keywords: graph neural networks; graph learning; ensemble learning; multilayer DropNode propagation; knowledge passing
JEL-codes: C
Date: 2022
Citations: 3
Downloads: (external link)
https://www.mdpi.com/2227-7390/10/8/1300/pdf (application/pdf)
https://www.mdpi.com/2227-7390/10/8/1300/ (text/html)
Persistent link: https://EconPapers.repec.org/RePEc:gam:jmathe:v:10:y:2022:i:8:p:1300-:d:793586
Mathematics is currently edited by Ms. Emma He