
Defense-optimized BERT architecture against digital arrest attacks in intelligent systems

Akshat Gaurav, Vincent Shin-Hung Pan, Varsha Arya, Razaz Waheeb Attar, Amal Hassan Alhazmi, Ahmed Alhomoud, Brij B. Gupta and Kwok Tai Chui
Additional contact information
Akshat Gaurav: Montclair
Vincent Shin-Hung Pan: Chaoyang University of Technology
Varsha Arya: Hong Kong Metropolitan University
Razaz Waheeb Attar: Princess Nourah bint Abdulrahman University
Amal Hassan Alhazmi: Princess Nourah bint Abdulrahman University
Ahmed Alhomoud: Northern Border University
Brij B. Gupta: Asia University
Kwok Tai Chui: Hong Kong Metropolitan University

Telecommunication Systems: Modelling, Analysis, Design and Management, 2025, vol. 88, issue 3, No 27, 9 pages

Abstract: Digital arrest (DA) is a type of spear-phishing attack recently used by scammers to extract money from victims. In a DA scam, the scammer impersonates a law enforcement officer and issues instructions to the victim via email, SMS, or telephone. Because scammers copy the format and pattern of official communications, these messages evade conventional phishing detection models. In this context, this paper proposes an AI-driven BERT model for the detection of DA scams. The proposed model extracts the important information from the communication using multi-head attention, channel attention, and spatial attention blocks: multi-head attention captures semantic dependencies and contextual relationships between tokens, the channel attention module dynamically recalibrates the importance of feature maps to prioritize relevant dimensions, and spatial attention enhances the localization of critical DA cues by refining the spatial representation of features. Owing to these dedicated attention extraction blocks, the proposed model achieves superior performance, with an accuracy of 97.8% and an F1 score of 98.3%, significantly outperforming conventional models such as GRU, LSTM, and RNN. Furthermore, the proposed model requires 99.7% fewer FLOPs, has 99.6% fewer parameters, and demonstrates a time-efficiency improvement of up to 87.8% compared to these traditional models.
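The abstract describes a BERT encoder augmented with multi-head, channel, and spatial attention blocks feeding a classifier. The sketch below illustrates one plausible way to combine such blocks; the layer sizes, pooling choices, squeeze-and-excitation-style channel attention, and the order in which the blocks are applied are assumptions for illustration, not details taken from the paper.

```python
# Minimal sketch of a BERT + multi-head / channel / spatial attention classifier.
# Assumptions: block wiring, reduction ratio, and mean-pooling are illustrative only.
import torch
import torch.nn as nn
from transformers import BertModel  # Hugging Face Transformers

class ChannelAttention(nn.Module):
    """Recalibrate the hidden (channel) dimensions, squeeze-and-excitation style."""
    def __init__(self, hidden_size, reduction=16):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(hidden_size, hidden_size // reduction),
            nn.ReLU(),
            nn.Linear(hidden_size // reduction, hidden_size),
            nn.Sigmoid(),
        )

    def forward(self, x):                            # x: (batch, seq_len, hidden)
        weights = self.fc(x.mean(dim=1))             # pool over tokens -> (batch, hidden)
        return x * weights.unsqueeze(1)              # reweight each hidden dimension

class SpatialAttention(nn.Module):
    """Weight token positions so critical cues in the message stand out."""
    def __init__(self, hidden_size):
        super().__init__()
        self.score = nn.Linear(hidden_size, 1)

    def forward(self, x):                            # x: (batch, seq_len, hidden)
        weights = torch.softmax(self.score(x), dim=1)  # (batch, seq_len, 1)
        return x * weights

class DADetector(nn.Module):
    """BERT encoder followed by multi-head, channel, and spatial attention blocks."""
    def __init__(self, num_classes=2, model_name="bert-base-uncased"):
        super().__init__()
        self.bert = BertModel.from_pretrained(model_name)
        hidden = self.bert.config.hidden_size
        self.mha = nn.MultiheadAttention(hidden, num_heads=8, batch_first=True)
        self.channel_att = ChannelAttention(hidden)
        self.spatial_att = SpatialAttention(hidden)
        self.classifier = nn.Linear(hidden, num_classes)

    def forward(self, input_ids, attention_mask):
        h = self.bert(input_ids=input_ids,
                      attention_mask=attention_mask).last_hidden_state
        h, _ = self.mha(h, h, h, key_padding_mask=~attention_mask.bool())
        h = self.channel_att(h)
        h = self.spatial_att(h)
        return self.classifier(h.mean(dim=1))        # mean-pool tokens, then classify
```

In this reading, the channel block scores hidden dimensions while the spatial block scores token positions, which matches the abstract's division of labor between "relevant dimensions" and "critical DA cues"; the actual paper may combine them differently.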

Keywords: Digital arrest; BERT; Multi-head attention; Channel attention; Spatial attention (search for similar items in EconPapers)
Date: 2025

Downloads: (external link)
http://link.springer.com/10.1007/s11235-025-01337-4 Abstract (text/html)
Access to the full text of the articles in this series is restricted.

Related works:
This item may be available elsewhere in EconPapers: Search for items with the same title.


Persistent link: https://EconPapers.repec.org/RePEc:spr:telsys:v:88:y:2025:i:3:d:10.1007_s11235-025-01337-4

Ordering information: This journal article can be ordered from
http://www.springer.com/journal/11235

DOI: 10.1007/s11235-025-01337-4


Telecommunication Systems: Modelling, Analysis, Design and Management is currently edited by Muhammad Khan

More articles in Telecommunication Systems: Modelling, Analysis, Design and Management from Springer
Bibliographic data for series maintained by Sonal Shukla and Springer Nature Abstracting and Indexing.

 
Page updated 2025-08-19
Handle: RePEc:spr:telsys:v:88:y:2025:i:3:d:10.1007_s11235-025-01337-4