A context-aware enhanced local citation recommendation model integrating SciBERT and self-adaptive attention

Qianqian Wang, Hao Li, Mingjie Ma and Zhenhua Li
Additional contact information
Qianqian Wang: China University of Geosciences
Hao Li: China University of Geosciences
Mingjie Ma: China University of Geosciences
Zhenhua Li: China University of Geosciences

Scientometrics, 2025, vol. 130, issue 8, No 12, 4495-4517

Abstract: With the exponential growth of scientific literature, scientists are confronted with increasingly severe challenges in efficiently identifying suitable references for their research. As a tool that leverages the semantic information of both cited and citing literature to generate a list of relevant articles, citation recommendation has become a necessary means for researchers to find appropriate papers. Among existing approaches, the Dual Local Citation Recommendation (DualLCR) model has emerged as one of the most promising, owing to its semantic module for processing local citation contexts and its bibliographic module for incorporating global metadata. However, DualLCR still has two major limitations: first, it fails to fully capture domain-specific terminology and the semantic nuances of complex scientific contexts; second, its deep neural network architecture incurs significant computational overhead. To address these issues, this paper proposes an enhanced context-aware citation recommendation model, SAA-DualLCR (self-adaptive attention-enhanced dual local citation recommendation). The model strengthens the semantic module's understanding of academic terminology by introducing a SciBERT-based embedding layer (a BERT variant pretrained on full-text scientific papers with an in-domain vocabulary); it reduces the semantic module's selection bias in interpreting citation contexts by employing a self-adaptive attention module (SAM) that dynamically adjusts attention weights to locate the most relevant text segments; and it lowers the training complexity and memory overhead of both modules by replacing bidirectional long short-term memory (BiLSTM) units with bidirectional gated recurrent units (BiGRU). Experimental results on three benchmark datasets (ACL-200, ACL-600, and RefSeer) demonstrate that SAA-DualLCR consistently outperforms state-of-the-art baselines, achieving improvements of up to 7.1% in Recall@10 and 7.0% in MRR while reducing training time by approximately 25%.
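
The abstract names three concrete changes to DualLCR's semantic module: SciBERT embeddings, a self-adaptive attention module, and BiGRU in place of BiLSTM. The following is a minimal PyTorch sketch of how those pieces could compose, not the paper's actual implementation: the SciBERT checkpoint name (allenai/scibert_scivocab_uncased) is the standard Hugging Face release, but the SAM formulation shown here (a learned token scorer with a per-sequence temperature), along with all class and parameter names, is an assumption.

    # Sketch of a SAA-DualLCR-style semantic module, per the abstract:
    # SciBERT embeddings -> BiGRU encoder -> self-adaptive attention pooling.
    # The SAM form below (adaptive softmax temperature) is an assumption.
    import torch
    import torch.nn as nn
    from transformers import AutoModel, AutoTokenizer

    SCIBERT = "allenai/scibert_scivocab_uncased"  # in-domain vocab, scientific pretraining

    class SemanticModule(nn.Module):
        def __init__(self, hidden: int = 256):
            super().__init__()
            self.encoder = AutoModel.from_pretrained(SCIBERT)
            dim = self.encoder.config.hidden_size  # 768 for SciBERT
            # BiGRU instead of BiLSTM: one fewer gate per direction,
            # hence the lower training and memory cost the abstract cites.
            self.bigru = nn.GRU(dim, hidden, batch_first=True, bidirectional=True)
            # Assumed SAM: a learned query scores each token, and a
            # per-sequence temperature sharpens or flattens the weights.
            self.score = nn.Linear(2 * hidden, 1)
            self.temp = nn.Sequential(nn.Linear(2 * hidden, 1), nn.Softplus())

        def forward(self, input_ids, attention_mask):
            emb = self.encoder(input_ids, attention_mask=attention_mask).last_hidden_state
            states, _ = self.bigru(emb)                       # (B, T, 2H)
            scores = self.score(states).squeeze(-1)           # (B, T)
            scores = scores.masked_fill(attention_mask == 0, float("-inf"))
            t = self.temp(states.mean(dim=1)) + 1e-4          # adaptive temperature (B, 1)
            weights = torch.softmax(scores / t, dim=-1)       # (B, T)
            return (weights.unsqueeze(-1) * states).sum(dim=1)  # pooled context (B, 2H)

    tok = AutoTokenizer.from_pretrained(SCIBERT)
    batch = tok(["We follow the attention mechanism of [CITE] ..."],
                return_tensors="pt", padding=True, truncation=True)
    ctx = SemanticModule()(batch["input_ids"], batch["attention_mask"])
    print(ctx.shape)  # torch.Size([1, 512])

Under these assumptions, a low temperature concentrates the attention on a few tokens while a high one spreads it across the context, which is one plausible reading of how a SAM could reduce selection bias when interpreting citation contexts.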

Keywords: Local citation recommendation; Neural natural language processing; SciBERT; Self-adaptive attention
Date: 2025

Downloads: (external link)
http://link.springer.com/10.1007/s11192-025-05382-3 Abstract (text/html)
Access to the full text of the articles in this series is restricted.

Persistent link: https://EconPapers.repec.org/RePEc:spr:scient:v:130:y:2025:i:8:d:10.1007_s11192-025-05382-3

Ordering information: This journal article can be ordered from
http://www.springer.com/economics/journal/11192

DOI: 10.1007/s11192-025-05382-3

Scientometrics is currently edited by Wolfgang Glänzel

More articles in Scientometrics from Springer, Akadémiai Kiadó
Bibliographic data for series maintained by Sonal Shukla and Springer Nature Abstracting and Indexing.

Handle: RePEc:spr:scient:v:130:y:2025:i:8:d:10.1007_s11192-025-05382-3