EconPapers    

CPS-LSTM: Privacy-Sensitive Entity Adaptive Recognition Model for Power Systems

Hao Zhang, Jing Wang, Xuanyuan Wang, Xuhui Lü, Zhenzhi Guan, Zhenghua Cai and Hua Zhang
Additional contact information
Hao Zhang: State Grid Jibei Electric Power Company Limited, Beijing 100054, China
Jing Wang: Beijing Kedong Electric Power Control System Co., Ltd., Beijing 100192, China
Xuanyuan Wang: State Grid Jibei Electric Power Company Limited, Beijing 100054, China
Xuhui Lü: Beijing Kedong Electric Power Control System Co., Ltd., Beijing 100192, China
Zhenzhi Guan: State Key Laboratory of Networking and Switching Technology, Beijing University of Posts and Telecommunications, Beijing 100876, China
Zhenghua Cai: State Key Laboratory of Networking and Switching Technology, Beijing University of Posts and Telecommunications, Beijing 100876, China
Hua Zhang: State Key Laboratory of Networking and Switching Technology, Beijing University of Posts and Telecommunications, Beijing 100876, China

Energies, 2025, vol. 18, issue 8, 1-22

Abstract: With the widespread adoption of Android devices in the energy sector, an increasing number of applications rely on SDKs to access privacy-sensitive data such as device identifiers, location information, energy consumption, and user behavior. However, these data are often stored under differing formats and naming conventions, which makes consistent extraction and identification difficult. Traditional taint-analysis methods identify these entities inefficiently and cannot achieve accurate recognition. To address this issue, we first propose a high-quality data construction method based on privacy protocols, comprising sentence segmentation, compression encoding, and entity annotation. We then introduce CPS-LSTM (Character-level Privacy-sensitive Entity Adaptive Recognition Model), which improves the recognition of privacy-sensitive entities in mixed Chinese and English text by fusing character-level embeddings with word vectors. The model has a streamlined architecture that accelerates convergence and allows sentences to be processed in parallel. Our experimental results show that CPS-LSTM significantly outperforms the baseline methods in accuracy and recall: its accuracy is 0.09 higher than that of Lattice LSTM, 0.14 higher than that of WC-LSTM, and 0.05 higher than that of FLAT, and its recall is 0.07 higher than that of Lattice LSTM, 0.12 higher than that of WC-LSTM, and 0.02 higher than that of FLAT.
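
The abstract gives only a high-level description, so the following is a minimal sketch of the general technique it names, character-level embeddings fused with word vectors for sequence labeling, written in PyTorch. The class name CharWordFusionTagger, all layer sizes, the BiLSTM backbone, and the per-character word alignment are illustrative assumptions, not the authors' CPS-LSTM implementation.

import torch
import torch.nn as nn

class CharWordFusionTagger(nn.Module):
    # Illustrative BiLSTM tagger: character embeddings are concatenated with
    # per-character word vectors before sequence labeling (an assumption about
    # how character-level embedding and word vector fusion could be realized).
    def __init__(self, char_vocab, word_vocab, num_tags,
                 char_dim=64, word_dim=128, hidden_dim=256):
        super().__init__()
        self.char_emb = nn.Embedding(char_vocab, char_dim, padding_idx=0)
        self.word_emb = nn.Embedding(word_vocab, word_dim, padding_idx=0)
        self.lstm = nn.LSTM(char_dim + word_dim, hidden_dim // 2,
                            batch_first=True, bidirectional=True)
        self.classifier = nn.Linear(hidden_dim, num_tags)

    def forward(self, char_ids, word_ids):
        # char_ids, word_ids: (batch, seq_len), aligned per character position
        fused = torch.cat([self.char_emb(char_ids),
                           self.word_emb(word_ids)], dim=-1)
        hidden, _ = self.lstm(fused)      # (batch, seq_len, hidden_dim)
        return self.classifier(hidden)    # per-character tag logits

# Toy usage: a batch of 2 sentences, 10 characters each, 5 BIO-style tags.
model = CharWordFusionTagger(char_vocab=5000, word_vocab=20000, num_tags=5)
chars = torch.randint(1, 5000, (2, 10))
words = torch.randint(1, 20000, (2, 10))
logits = model(chars, words)              # shape: (2, 10, 5)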

Keywords: computer application technology; named entity recognition; privacy data; vocabulary enhancement
JEL-codes: Q Q0 Q4 Q40 Q41 Q42 Q43 Q47 Q48 Q49
Date: 2025

Downloads: (external link)
https://www.mdpi.com/1996-1073/18/8/2013/pdf (application/pdf)
https://www.mdpi.com/1996-1073/18/8/2013/ (text/html)

Related works:
This item may be available elsewhere in EconPapers: Search for items with the same title.


Persistent link: https://EconPapers.repec.org/RePEc:gam:jeners:v:18:y:2025:i:8:p:2013-:d:1634533


Energies is currently edited by Ms. Agatha Cao

More articles in Energies from MDPI
Bibliographic data for series maintained by MDPI Indexing Manager.

 
Handle: RePEc:gam:jeners:v:18:y:2025:i:8:p:2013-:d:1634533