Multi-Self-Attention for Aspect Category Detection and Biomedical Multilabel Text Classification with BERT

Xuelei Zhang, Xinyu Song, Ao Feng and Zhengjie Gao

Mathematical Problems in Engineering, 2021, vol. 2021, 1-6

Abstract:

Multilabel classification is one of the most challenging tasks in natural language processing, as it is technically more demanding than single-label classification while also being closer to many real-world applications. For each label, the text has a different focus or component distribution, which requires making full use of the local information in a sentence. Attention, a widely adopted mechanism in natural language processing, is a natural choice for this problem. This paper proposes a multilayer self-attention model that handles aspect-category and word-level attention at different granularities. Combined with the BERT pretraining model, it achieves competitive performance in aspect category detection and electronic medical record classification.
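The paper's exact architecture is not reproduced on this page; the sketch below only illustrates the general pattern the abstract describes — self-attention over encoder token states (such as BERT's output), pooled and fed to independent per-label sigmoid outputs for multilabel prediction. All dimensions, weight names, and the mean-pooling step are illustrative assumptions, written in NumPy so the idea stands on its own.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(H, Wq, Wk, Wv):
    # Scaled dot-product self-attention over token states H: (seq_len, d).
    Q, K, V = H @ Wq, H @ Wk, H @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    return softmax(scores, axis=-1) @ V

def multilabel_head(H, Wq, Wk, Wv, Wo, b):
    # Attend over tokens, mean-pool, then apply one independent
    # sigmoid per label (multilabel, not softmax over classes).
    pooled = self_attention(H, Wq, Wk, Wv).mean(axis=0)
    logits = pooled @ Wo + b
    return 1.0 / (1.0 + np.exp(-logits))

# Toy usage with made-up sizes: 8 tokens, hidden size 16, 5 labels.
rng = np.random.default_rng(0)
d, num_labels, seq_len = 16, 5, 8
H = rng.standard_normal((seq_len, d))        # stands in for BERT token states
Wq, Wk, Wv = (rng.standard_normal((d, d)) for _ in range(3))
Wo = rng.standard_normal((d, num_labels))
b = np.zeros(num_labels)

probs = multilabel_head(H, Wq, Wk, Wv, Wo, b)
predicted = probs > 0.5  # each label decided independently
```

In a real system the random `H` would be replaced by the contextual embeddings from a pretrained encoder, and the weights would be learned with a binary cross-entropy loss summed over labels.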

Date: 2021

Downloads: (external link)
http://downloads.hindawi.com/journals/MPE/2021/6658520.pdf (application/pdf)
http://downloads.hindawi.com/journals/MPE/2021/6658520.xml (text/xml)



Persistent link: https://EconPapers.repec.org/RePEc:hin:jnlmpe:6658520

DOI: 10.1155/2021/6658520


