Intent-Controllable Citation Text Generation

Shing-Yun Jung, Ting-Han Lin, Chia-Hung Liao, Shyan-Ming Yuan and Chuen-Tsai Sun
Additional contact information
Shing-Yun Jung: Department of Computer Science, National Yang Ming Chiao Tung University, Hsinchu 300, Taiwan
Ting-Han Lin: Department of Computer Science, National Yang Ming Chiao Tung University, Hsinchu 300, Taiwan
Chia-Hung Liao: Department of Computer Science, National Yang Ming Chiao Tung University, Hsinchu 300, Taiwan
Shyan-Ming Yuan: Department of Computer Science, National Yang Ming Chiao Tung University, Hsinchu 300, Taiwan
Chuen-Tsai Sun: Department of Computer Science, National Yang Ming Chiao Tung University, Hsinchu 300, Taiwan

Mathematics, 2022, vol. 10, issue 10, 1-17

Abstract: We study the problem of controllable citation text generation by introducing a new approach to generating citation texts. Citation text generation, as an assistive writing technique, has drawn considerable attention from researchers. However, current research on citation text generation rarely addresses how to generate citation texts that satisfy the citation intents specified by a paper's authors, especially at the beginning of paper writing. We propose a controllable citation text generation model that extends pre-trained sequence-to-sequence models, namely BART and T5, by using the citation intent as a control code so that the generated citation text meets the authors' citation intent. Experimental results demonstrate that our model can generate citation texts that are semantically similar to the reference citation texts and satisfy the given citation intent. Additionally, results from a human evaluation indicate that incorporating the citation intent may enable the models to generate relevant citation texts much as scientific paper authors do, even when only limited information from the citing paper is available.
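The abstract describes conditioning pre-trained sequence-to-sequence models (BART, T5) on a citation intent used as a control code. The sketch below is a rough illustration only, assuming the Hugging Face transformers library; the intent labels, input format, and model checkpoint are assumptions for illustration, not the authors' released code or exact setup.

```python
# Minimal sketch (not the authors' implementation): prepend a
# citation-intent control code to the encoder input of a pre-trained
# seq2seq model and generate a citation sentence conditioned on it.
from transformers import BartForConditionalGeneration, BartTokenizer

tokenizer = BartTokenizer.from_pretrained("facebook/bart-base")
model = BartForConditionalGeneration.from_pretrained("facebook/bart-base")

# Hypothetical intent label (e.g. "background", "method", "result").
# In a real fine-tuning setup, such control codes would typically be
# added to the tokenizer vocabulary and seen during training.
intent = "<background>"

# Context assembled from the citing and cited papers; the exact fields
# used here are illustrative assumptions based on the abstract.
context = ("Citing paper abstract: ... "
           "Cited paper title: ... Cited paper abstract: ...")

inputs = tokenizer(intent + " " + context, return_tensors="pt",
                   truncation=True, max_length=512)

# Generate a candidate citation sentence conditioned on the intent code.
output_ids = model.generate(**inputs, max_length=64, num_beams=4)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

Without task-specific fine-tuning the base checkpoint will not produce useful citation text; the sketch only shows where the intent control code enters the input.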

Keywords: citation text generation; citation intent; controllable text generation; pre-trained sequence-to-sequence model; natural language processing
JEL-codes: C
Date: 2022

Downloads: (external link)
https://www.mdpi.com/2227-7390/10/10/1763/pdf (application/pdf)
https://www.mdpi.com/2227-7390/10/10/1763/ (text/html)


Persistent link: https://EconPapers.repec.org/RePEc:gam:jmathe:v:10:y:2022:i:10:p:1763-:d:820768

Handle: RePEc:gam:jmathe:v:10:y:2022:i:10:p:1763-:d:820768