AssocKD: An Association-Aware Knowledge Distillation Method for Document-Level Event Argument Extraction
Lijun Tan,
Yanli Hu,
Jianwei Cao and
Zhen Tan
Additional contact information
All authors: National Key Laboratory of Information Systems Engineering, National University of Defense Technology, Changsha 410073, China
Mathematics, 2024, vol. 12, issue 18, 1-20
Abstract:
Event argument extraction is a crucial subtask of event extraction that aims to extract the arguments filling each argument role of a given event type. Most current document-level event argument extraction work extracts arguments for only one event at a time, without considering associations among events; this is known as document-level single-event extraction. However, the interrelationships among arguments can yield mutual gains in their extraction. We therefore propose AssocKD, an Association-aware Knowledge Distillation method for document-level event argument extraction, which enhances document-level multi-event extraction with event association knowledge. First, we introduce an association-aware training task that extracts unknown arguments given privileged knowledge of relevant arguments, yielding an association-aware model that captures both intra-event and inter-event relationships. Second, we adopt multi-teacher knowledge distillation to transfer this event association knowledge from the association-aware teacher models to the event argument extraction student model. AssocKD explicitly models and efficiently leverages event associations to enhance multi-event argument extraction at the document level. Experiments on the RAMS and WIKIEVENTS datasets show significant improvements, demonstrating the effectiveness of our method.
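The multi-teacher distillation step described in the abstract can be sketched as follows. This is a minimal illustration only: the function names are hypothetical, and averaging the teachers' temperature-softened distributions before computing a KL-divergence loss is one common multi-teacher scheme, not necessarily the paper's exact formulation or weighting.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax over a list of raw logits.
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def kl_divergence(p, q):
    # KL(p || q); assumes strictly positive probabilities.
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

def multi_teacher_kd_loss(student_logits, teacher_logits_list, temperature=2.0):
    # Average the softened distributions of all teachers, then
    # penalize the student's divergence from that average.
    student_probs = softmax(student_logits, temperature)
    n = len(teacher_logits_list)
    avg_teacher = [0.0] * len(student_logits)
    for t_logits in teacher_logits_list:
        t_probs = softmax(t_logits, temperature)
        avg_teacher = [a + p / n for a, p in zip(avg_teacher, t_probs)]
    # The temperature^2 factor keeps gradient magnitudes comparable
    # across temperatures (standard Hinton-style scaling).
    return temperature ** 2 * kl_divergence(avg_teacher, student_probs)
```

When the student's distribution matches the averaged teacher distribution the loss is zero, and it grows as the student diverges from the teachers' consensus; in the paper's setting the teachers would be the association-aware models and the student the plain argument extraction model.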
Keywords: document-level event argument extraction; multi-event argument extraction; association-aware construction; knowledge distillation
JEL-codes: C
Date: 2024
Downloads:
https://www.mdpi.com/2227-7390/12/18/2901/pdf (application/pdf)
https://www.mdpi.com/2227-7390/12/18/2901/ (text/html)
Persistent link: https://EconPapers.repec.org/RePEc:gam:jmathe:v:12:y:2024:i:18:p:2901-:d:1480037
Mathematics is currently edited by Ms. Emma He