REKP: Refined External Knowledge into Prompt-Tuning for Few-Shot Text Classification

Yuzhuo Dang, Weijie Chen, Xin Zhang and Honghui Chen
Additional contact information
Yuzhuo Dang: Science and Technology on Information Systems Engineering Laboratory, National University of Defense Technology, Changsha 410073, China
Weijie Chen: Science and Technology on Information Systems Engineering Laboratory, National University of Defense Technology, Changsha 410073, China
Xin Zhang: Science and Technology on Information Systems Engineering Laboratory, National University of Defense Technology, Changsha 410073, China
Honghui Chen: Science and Technology on Information Systems Engineering Laboratory, National University of Defense Technology, Changsha 410073, China

Mathematics, 2023, vol. 11, issue 23, 1-16

Abstract: Text classification is a machine learning technique that assigns a given text to predefined categories, enabling the automatic analysis and processing of textual data. However, the number of new text categories is growing faster than the supply of human-annotated data, leaving many new categories with few labeled examples. As a result, conventional deep neural networks are prone to overfitting, which limits their real-world applicability. To address this data scarcity, researchers have turned to few-shot learning. One efficient method is prompt-tuning, which reformulates the input text as a masked-token prediction problem featuring a [MASK] slot. Using a verbalizer, the model maps the predicted output words to labels, enabling accurate prediction. Nevertheless, previous prompt-based adaptation approaches often relied on manually produced verbalizers or a single word to represent an entire label, which makes the mapping granularity coarse and prevents words from being accurately mapped to their labels. To address these issues, we propose to enhance the verbalizer and construct the Refined External Knowledge into Prompt-tuning (REKP) model. We employ external knowledge bases to enlarge the mapping space of label words and design three refinement methods to remove noisy data. We conduct comprehensive experiments on four benchmark datasets, namely AG's News, Yahoo, IMDB, and Amazon. The results demonstrate that REKP outperforms state-of-the-art baselines in terms of Micro-F1 on knowledge-enhanced text classification. In addition, we conduct an ablation study to ascertain the contribution of each module in our model, revealing that the refinement module contributes significantly to classification accuracy.
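The pipeline the abstract describes — expanding each label's verbalizer words with an external knowledge base, pruning noisy candidates, then scoring labels from the masked-LM distribution at the [MASK] position — can be sketched in a few functions. This is a minimal illustration, not the paper's implementation: the toy knowledge base, the single relevance-threshold refinement rule (the paper uses three refinement methods), and the precomputed `mask_probs` dictionary (standing in for a pre-trained language model's output at the [MASK] slot) are all hypothetical.

```python
# Toy "external knowledge base": related words for each original label word.
KNOWLEDGE_BASE = {
    "sports": ["football", "basketball", "olympics"],
    "business": ["market", "finance", "economy"],
}

def expand_verbalizer(label_words, kb):
    """Expand each label's single word into a larger set of related words,
    enlarging the mapping space of the verbalizer."""
    return {label: [label] + kb.get(label, []) for label in label_words}

def refine(verbalizer, relevance, threshold=0.3):
    """One illustrative refinement rule: drop expanded words whose
    (assumed precomputed) relevance score falls below a threshold."""
    return {
        label: [w for w in words if relevance.get(w, 0.0) >= threshold]
        for label, words in verbalizer.items()
    }

def predict(mask_probs, verbalizer):
    """Score each label by averaging the masked-LM probabilities of its
    refined label words, then return the highest-scoring label."""
    scores = {
        label: sum(mask_probs.get(w, 0.0) for w in words) / max(len(words), 1)
        for label, words in verbalizer.items()
    }
    return max(scores, key=scores.get)
```

As a usage sketch: expanding `["sports", "business"]`, refining away low-relevance words such as a rarely helpful expansion, and feeding in mask-position probabilities that favor "football" yields the label "sports". The averaging in `predict` is one simple aggregation choice; weighting words by their relevance scores would be a natural alternative.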

Keywords: few-shot learning; text classification; prompt learning; pre-trained language model
JEL-codes: C
Date: 2023

Downloads: (external link)
https://www.mdpi.com/2227-7390/11/23/4780/pdf (application/pdf)
https://www.mdpi.com/2227-7390/11/23/4780/ (text/html)



Persistent link: https://EconPapers.repec.org/RePEc:gam:jmathe:v:11:y:2023:i:23:p:4780-:d:1288550


Mathematics is currently edited by Ms. Emma He


Handle: RePEc:gam:jmathe:v:11:y:2023:i:23:p:4780-:d:1288550