Graph-to-Text Generation with Bidirectional Dual Cross-Attention and Concatenation
Elias Lemuye Jimale,
Wenyu Chen,
Mugahed A. Al-antari,
Yeong Hyeon Gu,
Victor Kwaku Agbesi,
Wasif Feroze,
Feidu Akmel,
Juhar Mohammed Assefa and
Ali Shahzad
Additional contact information
Elias Lemuye Jimale: School of Computer Science and Engineering, University of Electronic Science and Technology of China, Chengdu 611731, China
Wenyu Chen: School of Computer Science and Engineering, University of Electronic Science and Technology of China, Chengdu 611731, China
Mugahed A. Al-antari: Department of Artificial Intelligence and Data Science, College of AI Convergence, Sejong University, Seoul 05006, Republic of Korea
Yeong Hyeon Gu: Department of Artificial Intelligence and Data Science, College of AI Convergence, Sejong University, Seoul 05006, Republic of Korea
Victor Kwaku Agbesi: School of Computer Science and Engineering, University of Electronic Science and Technology of China, Chengdu 611731, China
Wasif Feroze: School of Computer Science and Engineering, University of Electronic Science and Technology of China, Chengdu 611731, China
Feidu Akmel: School of Information and Communication Engineering, University of Electronic Science and Technology of China, Chengdu 611731, China
Juhar Mohammed Assefa: School of Computer Science and Engineering, University of Electronic Science and Technology of China, Chengdu 611731, China
Ali Shahzad: School of Computer Science and Engineering, University of Electronic Science and Technology of China, Chengdu 611731, China
Mathematics, 2025, vol. 13, issue 6, 1-21
Abstract:
Graph-to-text generation (G2T) involves converting structured graph data into natural language text, a task made challenging by the need for encoders to capture the entities and their relationships within the graph effectively. While transformer-based encoders have advanced natural language processing, their reliance on linearized data often obscures the complex interrelationships in graph structures, leading to structural loss. Conversely, graph attention networks excel at capturing graph structures but lack the pre-training advantages of transformers. To leverage the strengths of both modalities and bridge this gap, we propose a novel bidirectional dual cross-attention and concatenation (BDCC) mechanism that integrates outputs from a transformer-based encoder and a graph attention encoder. The bidirectional dual cross-attention computes attention scores bidirectionally, allowing graph features to attend to transformer features and vice versa, effectively capturing inter-modal relationships. The concatenation is applied to fuse the attended outputs, enabling robust feature fusion across modalities. We empirically validate BDCC on PathQuestions and WebNLG benchmark datasets, achieving BLEU scores of 67.41% and 66.58% and METEOR scores of 49.63% and 47.44%, respectively. The results outperform the baseline models and demonstrate that BDCC significantly improves G2T tasks by leveraging the synergistic benefits of graph attention and transformer encoders, addressing the limitations of existing approaches and showcasing the potential for future research in this area.
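The fusion mechanism described in the abstract — graph features attending to transformer features and vice versa, with the attended outputs concatenated — can be illustrated with a minimal, single-head NumPy sketch. The names (`bdcc_fuse`, `cross_attention`) and the choice to concatenate along the sequence axis are illustrative assumptions only; the paper's actual model uses learned projections, multi-head attention, and pre-trained encoders.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(queries, keys_values):
    """Scaled dot-product attention: `queries` attend to `keys_values`."""
    d = queries.shape[-1]
    scores = queries @ keys_values.T / np.sqrt(d)
    return softmax(scores, axis=-1) @ keys_values

def bdcc_fuse(transformer_feats, graph_feats):
    # Bidirectional dual cross-attention: graph features attend to
    # transformer features, and transformer features attend to graph features.
    graph_to_text = cross_attention(graph_feats, transformer_feats)
    text_to_graph = cross_attention(transformer_feats, graph_feats)
    # Fuse the two attended outputs by concatenation (here along the
    # sequence axis, since the two streams have different lengths).
    return np.concatenate([text_to_graph, graph_to_text], axis=0)

rng = np.random.default_rng(0)
text_feats = rng.normal(size=(6, 16))   # 6 linearized-input tokens, dim 16
graph_feats = rng.normal(size=(4, 16))  # 4 graph nodes, dim 16
fused = bdcc_fuse(text_feats, graph_feats)
print(fused.shape)  # prints (10, 16)
```

In a full model, the fused representation would feed a decoder that generates the output text; the sketch only shows how the two encoder modalities exchange information before fusion.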
Keywords: data-to-text generation; graph-to-text generation; graph neural network; graph attention; knowledge graph; language models; natural language generation; text generation; cross-attention
JEL-codes: C
Date: 2025
Downloads: (external link)
https://www.mdpi.com/2227-7390/13/6/935/pdf (application/pdf)
https://www.mdpi.com/2227-7390/13/6/935/ (text/html)
Persistent link: https://EconPapers.repec.org/RePEc:gam:jmathe:v:13:y:2025:i:6:p:935-:d:1610187
Mathematics is currently edited by Ms. Emma He