Learning without forgetting in NLP: Deep learning based transformer models for lifelong task incremental learning
Ambi Rachel Alex,
Malliga Subramanian,
Jayanth J R,
Keerthi Bala A T and
Taniya Mukherjee
Edelweiss Applied Science and Technology, 2025, vol. 9, issue 11, 448-466
Abstract:
This study investigates task incremental learning for developing transformer-based natural language processing (NLP) models capable of sequentially learning new tasks without forgetting previously acquired knowledge. It focuses on enabling models to move sequentially from abusive comment detection to offensive comment detection while maintaining previously acquired knowledge. Pre-trained transformer models such as BERT, ALBERT, RoBERTa, and DistilBERT were integrated with continual learning strategies, including rehearsal, pseudo-rehearsal, and regularization-based methods. Their effectiveness was evaluated on sequential abusive and offensive comment detection tasks using measures of classification accuracy and knowledge retention across multiple datasets. Among the evaluated models, BERT integrated with the Learning Without Forgetting (LwF) approach achieved the best trade-off between stability and plasticity, attaining accuracies of 91.06% for abusive and 98.8% for offensive comment detection. This demonstrates the ability of continual learning to avoid catastrophic forgetting in sequential NLP tasks. Task incremental learning enables transformer models to adapt to new linguistic challenges while retaining prior knowledge, supporting lifelong learning. The proposed framework provides valuable insights for building adaptable NLP systems applicable to content moderation, toxic speech detection, and other evolving language-based applications.
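The LwF approach named in the abstract regularizes learning of the new task by distilling the model's own pre-update predictions on the old task. A minimal sketch of that objective for a single example is below; the paper does not publish its implementation, so the function name, the temperature T, and the weight lam are illustrative assumptions, not the authors' code.

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax over a 1-D array of logits."""
    z = np.asarray(z, dtype=float) / T
    e = np.exp(z - z.max())
    return e / e.sum()

def lwf_loss(new_logits, new_label, old_logits_current, old_logits_recorded,
             T=2.0, lam=1.0):
    """Learning-without-Forgetting style loss for one example (sketch).

    new_logits          -- current model's outputs on the NEW task head
    new_label           -- integer ground-truth class for the new task
    old_logits_current  -- current model's outputs on the OLD task head
    old_logits_recorded -- frozen pre-update model's outputs on the old head
    """
    # Standard cross-entropy on the new task (plasticity term).
    ce = -np.log(softmax(new_logits)[new_label])
    # Distillation term (stability): keep the old-task predictions close
    # to those the model produced before training on the new task began.
    p_old = softmax(old_logits_recorded, T)
    p_cur = softmax(old_logits_current, T)
    distill = -np.sum(p_old * np.log(p_cur))
    return ce + lam * distill
```

The distillation term is minimized when the current old-task predictions match the recorded ones, so drifting away from prior behaviour is penalized even though no old-task data is replayed.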
Keywords: Abusive task; Learning without forgetting; Offensive dataset; Task incremental learning; Transformer models.
Date: 2025
Downloads:
https://learning-gate.com/index.php/2576-8484/article/view/10905/3502 (application/pdf)
Persistent link: https://EconPapers.repec.org/RePEc:ajp:edwast:v:9:y:2025:i:11:p:448-466:id:10905
More articles in Edelweiss Applied Science and Technology from Learning Gate
Bibliographic data for series maintained by Melissa Fernandes.