Sentiment Analysis Based on Bert and Transformer
Tang Yue and Ma Jing
Additional contact information
Tang Yue: College of Economics and Management, Nanjing University of Aeronautics and Astronautics
Ma Jing: College of Economics and Management, Nanjing University of Aeronautics and Astronautics
A chapter in AI and Analytics for Public Health, 2022, pp 317-328 from Springer
Abstract: With the development of social media, the volume of comment text on network platforms is growing rapidly, so judging the emotional polarity of comments quickly and accurately is of great significance in fields such as commodity review and public-opinion monitoring. Short text reviews suffer from sparse features. Most current sentiment analysis models use TF-IDF, word2vec, or GloVe to obtain word vector representations, which cannot fully express the contextual semantic information of the text. Moreover, the mainstream bidirectional recurrent neural network models depend heavily on sequence information, making it difficult for them to attend to the important information in the text. To address these problems, this paper proposes a sentiment analysis method that combines BERT word vectors with the Transformer self-attention model. Rich semantic representations are obtained through the BERT model, and feature weights are dynamically adjusted by the self-attention model to produce the sentiment classification of short texts. The experimental results show that BERT word vectors enhance the semantic representation of the text, and that the self-attention model reduces dependence on external parameters, letting the model focus on its own key information and significantly improving classification performance. This paper makes two contributions: first, the sparse-feature problem of short texts is addressed by using BERT word vectors to improve the semantic expression of the text and obtain more comprehensive semantic information; second, feature information is weighted by a self-attention mechanism to highlight the important information in the text.
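The pipeline the abstract describes (contextual token vectors, e.g. from BERT, re-weighted by scaled dot-product self-attention) can be sketched minimally as follows. The dimensions, random projection matrices, and toy inputs here are illustrative assumptions, not the authors' actual model configuration:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a sequence of token vectors.

    X          : (seq_len, d_model) contextual embeddings (e.g. BERT outputs)
    Wq, Wk, Wv : (d_model, d_k) learned projection matrices
    Returns the re-weighted token sequence and the attention matrix.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)     # (seq_len, seq_len) similarity scores
    weights = softmax(scores, axis=-1)  # each row sums to 1
    return weights @ V, weights

# Toy example: 4 tokens with 8-dimensional embeddings.
rng = np.random.default_rng(0)
X = rng.standard_normal((4, 8))
Wq, Wk, Wv = (rng.standard_normal((8, 8)) * 0.1 for _ in range(3))
out, attn = self_attention(X, Wq, Wk, Wv)
```

In the paper's setting, a pooled version of `out` would then feed a classification layer; the weighting step is what lets the model emphasize sentiment-bearing tokens independently of their position in the sequence.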
Keywords: Sentiment analysis; BERT; Self-attention; Transformer
Date: 2022
Persistent link: https://EconPapers.repec.org/RePEc:spr:prbchp:978-3-030-75166-1_23
Ordering information: This item can be ordered from
http://www.springer.com/9783030751661
DOI: 10.1007/978-3-030-75166-1_23
More chapters in Springer Proceedings in Business and Economics from Springer