Hierarchical Self-Attention Hybrid Sparse Networks for Document Classification
Weichun Huang, Ziqiang Tao, Xiaohui Huang, Liyan Xiong and Jia Yu
Mathematical Problems in Engineering, 2021, vol. 2021, 1-10
Abstract:
Document classification is a fundamental problem in natural language processing, and deep learning has achieved great success on this task. However, most existing models do not incorporate sentence structure as a semantic feature of the text and pay little attention to the contextual importance of words and sentences. In this paper, we present a new model for document classification based on a sparse recurrent neural network and a self-attention mechanism. We then analyze three variants of GRU and LSTM to evaluate the sparse model on different datasets. Extensive experiments demonstrate that our model achieves competitive performance and outperforms previous models.
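The abstract outlines a hierarchical design: word-level and sentence-level recurrent encoders, each followed by self-attention, with sparsity in the recurrent component. The sketch below is not the authors' implementation; it assumes a standard hierarchical attention layout (bidirectional GRUs with additive attention pooling) and approximates the sparse recurrent network with L1 magnitude pruning from torch.nn.utils.prune. All class names, layer sizes, and the pruning ratio are illustrative assumptions.

# Minimal sketch (assumed architecture, not the paper's code): word-level
# GRU + self-attention builds sentence vectors; sentence-level GRU +
# self-attention builds the document vector; magnitude pruning stands in
# for the paper's sparse recurrent weights.
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune


class AttentionPool(nn.Module):
    """Additive self-attention pooling over a sequence of hidden states."""
    def __init__(self, hidden_dim):
        super().__init__()
        self.proj = nn.Linear(hidden_dim, hidden_dim)
        self.context = nn.Linear(hidden_dim, 1, bias=False)

    def forward(self, h):                                  # h: (batch, seq, hidden)
        scores = self.context(torch.tanh(self.proj(h)))    # (batch, seq, 1)
        weights = torch.softmax(scores, dim=1)
        return (weights * h).sum(dim=1)                    # (batch, hidden)


class HierarchicalSparseNet(nn.Module):
    def __init__(self, vocab_size, emb_dim=128, hidden=64, num_classes=5,
                 sparsity=0.5):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.word_rnn = nn.GRU(emb_dim, hidden, batch_first=True,
                               bidirectional=True)
        self.word_attn = AttentionPool(2 * hidden)
        self.sent_rnn = nn.GRU(2 * hidden, hidden, batch_first=True,
                               bidirectional=True)
        self.sent_attn = AttentionPool(2 * hidden)
        self.classifier = nn.Linear(2 * hidden, num_classes)
        # Approximation of the sparse RNN: prune hidden-to-hidden weights
        # by magnitude (the paper's sparsification scheme may differ).
        for rnn in (self.word_rnn, self.sent_rnn):
            for name in ("weight_hh_l0", "weight_hh_l0_reverse"):
                prune.l1_unstructured(rnn, name, amount=sparsity)

    def forward(self, docs):                               # docs: (batch, sents, words)
        b, s, w = docs.shape
        words = self.embed(docs.view(b * s, w))            # (b*s, w, emb)
        word_h, _ = self.word_rnn(words)                   # (b*s, w, 2*hidden)
        sent_vecs = self.word_attn(word_h).view(b, s, -1)  # (b, s, 2*hidden)
        sent_h, _ = self.sent_rnn(sent_vecs)               # (b, s, 2*hidden)
        doc_vec = self.sent_attn(sent_h)                   # (b, 2*hidden)
        return self.classifier(doc_vec)


# Usage: two documents, six sentences each, thirty tokens per sentence.
model = HierarchicalSparseNet(vocab_size=20000)
logits = model(torch.randint(1, 20000, (2, 6, 30)))        # -> (2, num_classes)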
Date: 2021
Downloads:
http://downloads.hindawi.com/journals/MPE/2021/5594895.pdf (application/pdf)
http://downloads.hindawi.com/journals/MPE/2021/5594895.xml (text/xml)
Persistent link: https://EconPapers.repec.org/RePEc:hin:jnlmpe:5594895
DOI: 10.1155/2021/5594895