Entity Linking Based on Sentence Representation
Bingjing Jia, Zhongli Wu, Pengpeng Zhou, Bin Wu and Wei Wang
Complexity, 2021, vol. 2021, 1-9
Abstract:
Entity linking involves mapping ambiguous mentions in documents to the correct entities in a given knowledge base. Most existing methods fail when a mention appears multiple times in a document, since conflicting contexts at its different locations make linking difficult. Sentence representation, which has recently been studied with deep learning approaches, can be used to resolve this issue. In this paper, an effective entity linking model is proposed to capture the semantic meaning of sentences and reduce the noise introduced by the different contexts of the same mention within a document. The model first exploits the symmetry of a Siamese network to learn sentence similarity. Then, an attention mechanism is added to improve the interaction between the input sentences. To show the effectiveness of our sentence representation model combined with the attention mechanism, named ELSR, extensive experiments are conducted on two public datasets. The results illustrate that our model outperforms the baselines and achieves superior performance.
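The two ingredients named in the abstract, a shared (Siamese) sentence encoder and a cross-sentence attention step, can be sketched as a scorer that compares a mention's sentence with a candidate entity's description. The sketch below is only an illustration of that idea: the BiLSTM encoder, soft-alignment attention, mean pooling, layer sizes, and the class name SiameseAttentionScorer are assumptions, not the paper's exact ELSR architecture.

```python
# Minimal sketch (assumed architecture, not the published ELSR model):
# a Siamese sentence encoder with cross-attention that scores how well a
# candidate entity description matches the sentence containing a mention.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SiameseAttentionScorer(nn.Module):
    def __init__(self, vocab_size, emb_dim=300, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        # One shared encoder for both inputs: the "symmetry" of a Siamese network.
        self.encoder = nn.LSTM(emb_dim, hidden_dim,
                               batch_first=True, bidirectional=True)
        self.classifier = nn.Sequential(
            nn.Linear(8 * hidden_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, 1),
        )

    def encode(self, ids):
        out, _ = self.encoder(self.embed(ids))        # (B, T, 2H)
        return out

    def forward(self, mention_ids, entity_ids):
        a = self.encode(mention_ids)                  # sentence containing the mention
        b = self.encode(entity_ids)                   # candidate entity description
        # Soft-alignment attention between the two token sequences.
        scores = torch.bmm(a, b.transpose(1, 2))      # (B, Ta, Tb)
        a_att = torch.bmm(F.softmax(scores, dim=2), b)                        # b aligned to a
        b_att = torch.bmm(F.softmax(scores, dim=1).transpose(1, 2), a)        # a aligned to b
        # Pool original and attended representations, then compare.
        va = torch.cat([a.mean(1), a_att.mean(1)], dim=-1)   # (B, 4H)
        vb = torch.cat([b.mean(1), b_att.mean(1)], dim=-1)   # (B, 4H)
        features = torch.cat([va, vb], dim=-1)                # (B, 8H)
        return torch.sigmoid(self.classifier(features)).squeeze(-1)  # match probability
```

In use, such a scorer would be applied to each candidate entity retrieved for a mention, and the mention linked to the highest-scoring candidate; because the comparison is made at the sentence level, different occurrences of the same mention can be scored against their own local sentences rather than a single conflicting document context.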
Date: 2021
Downloads: (external link)
http://downloads.hindawi.com/journals/complexity/2021/8895742.pdf (application/pdf)
http://downloads.hindawi.com/journals/complexity/2021/8895742.xml (application/xml)
Persistent link: https://EconPapers.repec.org/RePEc:hin:complx:8895742
DOI: 10.1155/2021/8895742