Harnessing Attention-Based Graph Recurrent Neural Networks for Enhanced Conversational Flow Prediction via Conversational Graph Construction
R. Sujatha and
K. Nimala
Additional contact information
R. Sujatha: Department of Networking and Communications, SRM Institute of Science and Technology, Kattankulathur, Chengalpattu 603203, Tamil Nadu, India
K. Nimala: Department of Networking and Communications, SRM Institute of Science and Technology, Kattankulathur, Chengalpattu 603203, Tamil Nadu, India
Journal of Information & Knowledge Management (JIKM), 2024, vol. 23, issue 03, 1-17
Abstract:
Conversational flow refers to the progression of a conversation, encompassing the order in which topics are discussed and how responses are delivered. A smooth flow involves participants taking turns to speak and responding naturally and intuitively; a more disjointed flow may entail prolonged pauses or difficulty establishing common ground. Numerous factors influence conversational flow, including the personalities of the participants, their familiarity with each other, and the contextual setting. A conversational graph pattern describes how a conversation typically unfolds, or the underlying structure it adheres to: the mix of sentence types, the sequential order of topics discussed, and the roles played by different participants. Predicting subsequent sentences relies on predefined patterns, the context derived from the prior conversational flow in the data, and the trained system; prediction accuracy varies with the probability of identifying sentences that fit the subsequent pattern. We employ the Graph Recurrent Neural Network with Attention (GRNNA) model to construct conversational graphs and perform next-sentence prediction. The model builds a conversational graph from an adjacency matrix, node features (the sentences), and edge features (the semantic similarity between sentences), and predicts the next node (sentence) using attention mechanisms, recurrent updates, and information aggregated from neighbouring nodes. Updating node representations through multiple iterations of message passing and recurrent updates improves the model's predictive capability. Experimental results on a conversation dataset show that the GRNNA model surpasses a Graph Neural Network (GNN) baseline in next-sentence prediction, achieving an accuracy of 98.89%.
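The abstract describes the GRNNA pipeline only at a high level. As a rough illustration, the Python sketch below shows one way such a layer could be assembled: a conversational graph built from pairwise semantic similarity of sentence embeddings, attention-weighted aggregation over neighbouring nodes, and a GRU-based recurrent update of node states. All names (build_conversation_graph, GRNNALayer), the similarity threshold, and the embedding sizes are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

def build_conversation_graph(sent_emb, sim_threshold=0.5):
    # Pairwise cosine similarity between sentence embeddings; edges connect
    # sentence pairs whose similarity clears the (assumed) threshold.
    z = F.normalize(sent_emb, dim=-1)
    sim = z @ z.t()
    adj = (sim >= sim_threshold).float()
    adj.fill_diagonal_(0)          # no self-loops; the GRU keeps each node's own state
    return adj, sim * adj          # binary adjacency and similarity edge weights

class GRNNALayer(nn.Module):
    # One iteration: attention-weighted aggregation over neighbouring nodes,
    # followed by a recurrent (GRU) update of each node's representation.
    def __init__(self, dim):
        super().__init__()
        self.attn = nn.Linear(2 * dim, 1)  # scores a (receiver, sender) node pair
        self.gru = nn.GRUCell(dim, dim)    # recurrent node-state update

    def forward(self, h, adj):
        n = h.size(0)
        hi = h.unsqueeze(1).expand(n, n, -1)   # receiver states, one row per node
        hj = h.unsqueeze(0).expand(n, n, -1)   # sender (neighbour) states
        scores = self.attn(torch.cat([hi, hj], dim=-1)).squeeze(-1)
        scores = scores.masked_fill(adj == 0, float("-inf"))
        alpha = torch.nan_to_num(torch.softmax(scores, dim=-1))  # isolated nodes get zero
        msg = alpha @ h                        # aggregate information from neighbours
        return self.gru(msg, h)                # message passing + recurrent update

# Toy usage: refine node states over several message-passing iterations; a
# next-sentence predictor could then score candidate sentences against the
# final node states.
emb = torch.randn(6, 64)                   # six sentences, 64-d embeddings (illustrative)
adj, _ = build_conversation_graph(emb)
layer = GRNNALayer(64)
h = emb
for _ in range(3):
    h = layer(h, adj)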
Keywords: Attention mechanism; conversational graph; conversational flow; graph attention network; graph neural network; recurrent neural network; sentence classification
Date: 2024
Downloads:
http://www.worldscientific.com/doi/abs/10.1142/S0219649224500382
Access to full text is restricted to subscribers
Related works:
This item may be available elsewhere in EconPapers: Search for items with the same title.
Persistent link: https://EconPapers.repec.org/RePEc:wsi:jikmxx:v:23:y:2024:i:03:n:s0219649224500382
DOI: 10.1142/S0219649224500382
Journal of Information & Knowledge Management (JIKM) is currently edited by Professor Suliman Hawamdeh
More articles in Journal of Information & Knowledge Management (JIKM) from World Scientific Publishing Co. Pte. Ltd.
Bibliographic data for series maintained by Tai Tone Lim.