
Attributed Graph Embedding with Random Walk Regularization and Centrality-Based Attention

Yuxuan Yang, Beibei Han, Zanxi Ran, Min Gao and Yingmei Wei ()
Additional contact information
Yuxuan Yang: School of System Engineering, National University of Defense Technology, Changsha 410073, China
Beibei Han: School of System Engineering, National University of Defense Technology, Changsha 410073, China
Zanxi Ran: School of System Engineering, National University of Defense Technology, Changsha 410073, China
Min Gao: School of System Engineering, National University of Defense Technology, Changsha 410073, China
Yingmei Wei: School of System Engineering, National University of Defense Technology, Changsha 410073, China

Mathematics, 2023, vol. 11, issue 8, 1-14

Abstract: Graph-embedding learning is the foundation of complex information network analysis; it aims to represent the nodes of a graph as low-dimensional, dense, real-valued vectors that can be used directly in downstream analysis tasks. Graph representation learning has received growing attention in recent years, and deep-learning-based graph neural networks (GNNs) play an increasingly important role in the field. However, most existing GNNs cannot exploit higher-order neighborhood information effectively and tend to ignore the influence of latent representations and structural properties on the resulting embedding. To address these issues, we introduce a centrality encoding to capture node structural properties, add a centrality-based attention mechanism to better distinguish the importance of neighboring nodes, and apply random walk regularization so that sampled neighbors consistently satisfy predetermined criteria, which allows the model to learn latent node representations. We evaluated the model on node-clustering and link prediction tasks using three widely used benchmark datasets. The experimental results show that our model significantly outperforms the baseline methods on both tasks, indicating that the graph embeddings it produces are highly expressive.
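
To make the three ingredients named in the abstract concrete, below is a minimal NumPy sketch, not the authors' implementation: the centrality encoding is approximated with a degree-indexed embedding table, the centrality-based attention with a GAT-style softmax over each node's neighborhood, and the random walk regularization with a simple fixed-length walk sampler. All names and weights here (table, a, the toy adjacency matrix) are illustrative stand-ins for learned parameters.

import numpy as np

rng = np.random.default_rng(0)

def centrality_encoding(adj, dim):
    # Map each node's degree centrality to an embedding row.
    # The table would be learned in practice; here it is random for illustration.
    degrees = adj.sum(axis=1).astype(int)
    table = rng.normal(size=(degrees.max() + 1, dim))
    return table[degrees]                      # shape: (n_nodes, dim)

def attention_scores(h, adj):
    # GAT-style attention restricted to graph neighbors:
    # score each edge (i, j), then softmax over node i's neighborhood.
    n, d = h.shape
    a = rng.normal(size=(2 * d,))              # stand-in for a learned attention vector
    scores = np.full((n, n), -np.inf)
    for i in range(n):
        for j in np.nonzero(adj[i])[0]:
            scores[i, j] = np.concatenate([h[i], h[j]]) @ a
    m = np.max(scores, axis=1, keepdims=True, initial=0.0)
    e = np.exp(scores - m)                     # exp(-inf) = 0 for non-neighbors
    return e / (e.sum(axis=1, keepdims=True) + 1e-12)

def random_walk(adj, start, length):
    # Sample a fixed-length random walk; such walks drive the
    # neighbor-sampling regularization described in the abstract.
    walk = [start]
    for _ in range(length):
        nbrs = np.nonzero(adj[walk[-1]])[0]
        if nbrs.size == 0:
            break
        walk.append(int(rng.choice(nbrs)))
    return walk

# Toy usage on a 4-node graph.
adj = np.array([[0, 1, 1, 0],
                [1, 0, 1, 0],
                [1, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
h = centrality_encoding(adj, dim=8)
alpha = attention_scores(h, adj)               # rows sum to ~1 over each node's neighbors
walk = random_walk(adj, start=0, length=5)

In the paper these components would be trained end to end against the embedding objective; the fixed random values above are only meant to show how centrality encodings, neighborhood attention weights, and walk-based neighbor samples fit together.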

Keywords: attributed graph embedding; attributed network; graph representation learning; graph neural networks
JEL-codes: C
Date: 2023

Downloads: (external link)
https://www.mdpi.com/2227-7390/11/8/1830/pdf (application/pdf)
https://www.mdpi.com/2227-7390/11/8/1830/ (text/html)

Related works:
This item may be available elsewhere in EconPapers: Search for items with the same title.


Persistent link: https://EconPapers.repec.org/RePEc:gam:jmathe:v:11:y:2023:i:8:p:1830-:d:1121680


Mathematics is currently edited by Ms. Emma He

More articles in Mathematics from MDPI
Bibliographic data for series maintained by MDPI Indexing Manager ().

Handle: RePEc:gam:jmathe:v:11:y:2023:i:8:p:1830-:d:1121680