Switching Self-Attention Text Classification Model with Innovative Reverse Positional Encoding for Right-to-Left Languages: A Focus on Arabic Dialects
Laith H. Baniata and Sangwoo Kang
Additional contact information
Laith H. Baniata: School of Computing, Gachon University, Seongnam 13120, Republic of Korea
Sangwoo Kang: School of Computing, Gachon University, Seongnam 13120, Republic of Korea
Mathematics, 2024, vol. 12, issue 6, 1-26
Abstract:
Transformer models have emerged as frontrunners in natural language processing, largely because their self-attention mechanisms capture the semantic relationships between words in a sequence. Despite these strengths, such models often struggle in single-task learning scenarios, particularly in delivering strong performance and learning robust latent feature representations. The problem is more pronounced on smaller datasets and is particularly acute for under-resourced languages such as Arabic. To address these challenges, this study introduces a novel methodology for the text classification of Arabic texts that harnesses a newly developed Reverse Positional Encoding (RPE) technique. The method adopts an inductive-transfer-learning (ITL) framework combined with a switching self-attention shared encoder, increasing the model's adaptability and improving the accuracy of its sentence representations. Integrating Mixture-of-Experts (MoE) routing with RPE enables the model to process longer sequences more effectively, a benefit for Arabic text classification on both the intricate five-point and the simpler ternary classification tasks. Empirical evaluation confirms strong performance, with accuracies of 87.20% on the HARD dataset, 72.17% on the BRAD dataset, and 86.89% on the LABR dataset.
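The abstract's central idea, indexing positions from the end of the sequence so that encodings follow right-to-left reading order, can be sketched as follows. This is an illustrative reconstruction, not the paper's exact formulation: the function names are hypothetical, and the choice of standard sinusoidal encodings applied to reversed indices is an assumption.

```python
import numpy as np

def sinusoidal_pe(positions, d_model):
    # Standard sinusoidal positional encoding (Vaswani et al., 2017),
    # evaluated at an arbitrary array of position indices.
    pe = np.zeros((len(positions), d_model))
    div = np.exp(np.arange(0, d_model, 2) * (-np.log(10000.0) / d_model))
    pe[:, 0::2] = np.sin(positions[:, None] * div)
    pe[:, 1::2] = np.cos(positions[:, None] * div)
    return pe

def reverse_positional_encoding(seq_len, d_model):
    # Hypothetical RPE: count positions from the end of the sequence
    # (seq_len-1, ..., 1, 0) so that index 0 falls on the rightmost
    # token, matching the right-to-left reading order of Arabic.
    positions = np.arange(seq_len - 1, -1, -1, dtype=np.float64)
    return sinusoidal_pe(positions, d_model)
```

Under this sketch, the RPE table is simply the standard encoding table flipped along the sequence axis, so the first (rightmost-read) token always receives the same encoding regardless of sequence length.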
Keywords: switching self-attention; reverse positional encoding (RPE) method; text classification (TC); right-to-left text; five-polarity; ITL
JEL-codes: C
Date: 2024
Downloads:
https://www.mdpi.com/2227-7390/12/6/865/pdf (application/pdf)
https://www.mdpi.com/2227-7390/12/6/865/ (text/html)
Persistent link: https://EconPapers.repec.org/RePEc:gam:jmathe:v:12:y:2024:i:6:p:865-:d:1357766
Mathematics is currently edited by Ms. Emma He
More articles in Mathematics from MDPI
Bibliographic data for series maintained by MDPI Indexing Manager.