On the effectiveness of limited-data large language model fine-tuning for Arabic

Mohamed Alkaoud

PLOS ONE, 2025, vol. 20, issue 10, 1-26

Abstract: This paper investigates fine-tuning large language models (LLMs) for Arabic natural language processing (NLP) tasks. Although recent multilingual LLMs have made remarkable progress in zero-shot and few-shot settings, specialized models such as fine-tuned BERT variants continue to define state-of-the-art (SOTA) performance on many Arabic tasks. We demonstrate that fine-tuning a general-purpose LLM (GPT-4o mini) on only a small subset (3.0%–7.5%) of the training samples exceeds the previous best reported results in sentiment analysis (ArSAS) and sarcasm detection (ArSarcasm), while achieving performance statistically comparable to the SOTA in news categorization (ASND). This study highlights that LLMs, when properly adapted, can outperform established models without relying on full-scale annotated training sets. Furthermore, our analysis with the open-source Gemma-3-27B model confirms the generalizability of our data-efficient method: it enabled the model to achieve performance statistically comparable to the SOTA on all three tasks, although the proprietary GPT-4o mini maintained an overall performance advantage. We further compare GPT-4o with GPT-4o mini to examine the impact of model size on fine-tuning; GPT-4o outperforms GPT-4o mini across all sample sizes, but by small margins (…).
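The abstract's recipe, fine-tuning GPT-4o mini on a small random subset of a labeled Arabic corpus, can be sketched roughly as follows. This is a minimal illustration assuming the OpenAI fine-tuning API and its chat-style JSONL training format; the 5% sampling rate, the example sentences, the system-prompt wording, and the file name are illustrative assumptions, not the paper's exact configuration.

import json
import random

from openai import OpenAI

# Illustrative labeled pairs; in practice this would be a full Arabic
# training set such as ArSAS (sentiment). These rows are placeholders.
train = [
    ("الخدمة كانت ممتازة", "positive"),
    ("تجربة سيئة للغاية", "negative"),
    ("الفيلم كان عاديا", "neutral"),
]

# Draw a small random subset. The paper reports using only 3.0%-7.5% of
# the training samples; 5% here is an arbitrary value in that range.
random.seed(0)
k = max(1, int(0.05 * len(train)))
subset = random.sample(train, k)

# Write the subset in the chat-style JSONL format expected by the
# OpenAI fine-tuning endpoint. The prompt wording is an assumption.
with open("arabic_subset.jsonl", "w", encoding="utf-8") as f:
    for text, label in subset:
        record = {
            "messages": [
                {"role": "system", "content": "Classify the sentiment of the Arabic text."},
                {"role": "user", "content": text},
                {"role": "assistant", "content": label},
            ]
        }
        f.write(json.dumps(record, ensure_ascii=False) + "\n")

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Upload the training file and launch a fine-tuning job on a
# fine-tunable GPT-4o mini snapshot.
training_file = client.files.create(
    file=open("arabic_subset.jsonl", "rb"), purpose="fine-tune"
)
job = client.fine_tuning.jobs.create(
    training_file=training_file.id,
    model="gpt-4o-mini-2024-07-18",
)
print(job.id)

Once the job completes, the returned fine-tuned model name can be passed to the standard chat completions endpoint to score a held-out test set, which is how results such as those on ArSAS, ArSarcasm, and ASND would be evaluated.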

Date: 2025

Downloads: (external link)
https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0332419 (text/html)
https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0332419&type=printable (application/pdf)

Related works:
This item may be available elsewhere in EconPapers: Search for items with the same title.

Persistent link: https://EconPapers.repec.org/RePEc:plo:pone00:0332419

DOI: 10.1371/journal.pone.0332419

More articles in PLOS ONE from Public Library of Science
Bibliographic data for series maintained by plosone.

 
Handle: RePEc:plo:pone00:0332419