DecoStrat: Leveraging the Capabilities of Language Models in D2T Generation via Decoding Framework

Elias Lemuye Jimale, Wenyu Chen, Mugahed A. Al-antari, Yeong Hyeon Gu, Victor Kwaku Agbesi and Wasif Feroze
Additional contact information
Elias Lemuye Jimale: School of Computer Science and Engineering, University of Electronic Science and Technology of China, Chengdu 611731, China
Wenyu Chen: School of Computer Science and Engineering, University of Electronic Science and Technology of China, Chengdu 611731, China
Mugahed A. Al-antari: Department of Artificial Intelligence and Data Science, College of AI Convergence, Daeyang AI Center, Sejong University, Seoul 05006, Republic of Korea
Yeong Hyeon Gu: Department of Artificial Intelligence and Data Science, College of AI Convergence, Daeyang AI Center, Sejong University, Seoul 05006, Republic of Korea
Victor Kwaku Agbesi: School of Computer Science and Engineering, University of Electronic Science and Technology of China, Chengdu 611731, China
Wasif Feroze: School of Computer Science and Engineering, University of Electronic Science and Technology of China, Chengdu 611731, China

Mathematics, 2024, vol. 12, issue 22, 1-26

Abstract: Current language models have achieved remarkable success in NLP tasks. Nonetheless, individual decoding methods struggle to realize the immense potential of these models, primarily because there is no decoding framework that can integrate language models and decoding methods. We introduce DecoStrat, which bridges the gap between language modeling and the decoding process in D2T generation. By leveraging language models, DecoStrat facilitates the exploration of alternative decoding methods tailored to specific tasks. We fine-tuned the model on the MultiWOZ dataset to meet task-specific requirements and employed it to generate output through the framework's interacting modules. The Director module orchestrates the decoding process, engaging the Generator to produce output text based on the selected decoding method and input data. The Manager module enforces a selection strategy, integrating the Ranker and Selector to identify the optimal result. Evaluations on this dataset show that DecoStrat produces diverse and accurate output, with minimum Bayes risk (MBR) variants consistently outperforming other decoding methods. DecoStrat with the T5-small model surpasses some baseline frameworks. Overall, the findings highlight DecoStrat's potential for optimizing decoding methods in diverse real-world applications.
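The abstract names five interacting modules but gives no implementation detail. The following minimal Python sketch shows one plausible arrangement of those modules (Director, Generator, Manager, Ranker, Selector); every class and signature here is an assumption made for illustration, not the paper's actual code, and the word-overlap utility is only a stand-in for a real MBR metric such as BLEU.

    from dataclasses import dataclass
    from typing import Callable, List
    import random

    @dataclass
    class Generator:
        # Wraps a fine-tuned LM together with a chosen decoding method;
        # returns a pool of candidate texts for one input (assumed interface).
        decode: Callable[[str, int], List[str]]

        def generate(self, source: str, num_candidates: int) -> List[str]:
            return self.decode(source, num_candidates)

    class Ranker:
        # Scores each candidate; this toy MBR-style score is the average
        # word overlap with the other candidates (a real system would use
        # a utility such as BLEU or a learned metric).
        def score(self, idx: int, candidates: List[str]) -> float:
            def overlap(a: str, b: str) -> float:
                wa, wb = set(a.split()), set(b.split())
                return len(wa & wb) / max(len(wa | wb), 1)
            others = [c for j, c in enumerate(candidates) if j != idx]
            return sum(overlap(candidates[idx], o) for o in others) / max(len(others), 1)

    class Selector:
        # Picks the highest-scoring candidate.
        def select(self, candidates: List[str], scores: List[float]) -> str:
            return max(zip(candidates, scores), key=lambda pair: pair[1])[0]

    class Manager:
        # Enforces the selection strategy by combining Ranker and Selector.
        def __init__(self, ranker: Ranker, selector: Selector):
            self.ranker, self.selector = ranker, selector

        def pick(self, candidates: List[str]) -> str:
            scores = [self.ranker.score(i, candidates) for i in range(len(candidates))]
            return self.selector.select(candidates, scores)

    class Director:
        # Orchestrates decoding: asks the Generator for candidates and the
        # Manager for the winning output.
        def __init__(self, generator: Generator, manager: Manager):
            self.generator, self.manager = generator, manager

        def run(self, source: str, num_candidates: int = 5) -> str:
            candidates = self.generator.generate(source, num_candidates)
            return self.manager.pick(candidates)

    if __name__ == "__main__":
        # Toy decoding method standing in for sampling from a fine-tuned T5.
        phrases = [
            "the hotel is in the north",
            "the hotel lies in the north area",
            "a hotel in the north",
            "the hotel is located in the north",
        ]
        def toy_decode(source: str, n: int) -> List[str]:
            return random.sample(phrases, k=min(n, len(phrases)))

        director = Director(Generator(toy_decode), Manager(Ranker(), Selector()))
        print(director.run("inform(area=north, type=hotel)", num_candidates=4))

Under these assumptions, swapping in a different decoding method only requires passing a different decode callable to the Generator, which mirrors the plug-and-play role the abstract attributes to the framework.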

Keywords: decoding methods; data-to-text generation (D2T); language models (LMs); natural language generation (NLG); natural language processing (NLP)
JEL-codes: C
Date: 2024

Downloads: (external link)
https://www.mdpi.com/2227-7390/12/22/3596/pdf (application/pdf)
https://www.mdpi.com/2227-7390/12/22/3596/ (text/html)

Persistent link: https://EconPapers.repec.org/RePEc:gam:jmathe:v:12:y:2024:i:22:p:3596-:d:1522881

Handle: RePEc:gam:jmathe:v:12:y:2024:i:22:p:3596-:d:1522881