Advancing language models through domain knowledge integration: a comprehensive approach to training, evaluation, and optimization of social scientific neural word embeddings
Fabian Stöhr
Additional contact information
Fabian Stöhr: University of Tuebingen
Journal of Computational Social Science, 2024, vol. 7, issue 2, No 23, 1753-1793
Abstract: This article proposes a comprehensive strategy for training, evaluating, and optimizing domain-specific word2vec-based word embeddings, using social science literature as an example. Our primary objectives are: (1) to train the embeddings on a corpus of social science text, (2) to test their performance against domain-unspecific embeddings with an intrinsic and extrinsic evaluation strategy we develop, and (3) to enhance their performance further by incorporating domain knowledge. As an integral part of this approach, we present SociRel-461, a domain-knowledge dictionary designed for the intrinsic evaluation and subsequent refinement of social science word embeddings. Using a dataset of 100,000 full-text scientific articles in sociology, we train multiple vector space models, which we benchmark against a larger, pre-trained general-language embedding model. For the extrinsic evaluation, we develop a transfer learning multi-label classification task. Our findings reveal that domain-specific embeddings outperform their domain-unspecific counterparts in both intrinsic and extrinsic evaluations. We also investigate retrofitting, a post-processing method that injects the domain knowledge encoded in SociRel-461 into existing embeddings. While retrofitting does not enhance our domain-specific vector space models, it significantly improves the performance of the domain-unspecific embeddings, highlighting its potential for transferring domain knowledge to domain-unspecific embeddings. Our results emphasize the importance of domain-specific word embeddings for domain-specific transfer learning tasks, where they outperform conventional embeddings trained on everyday language.
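The article itself details the training and retrofitting setup; as a rough illustration only, the sketch below trains a small word2vec model with gensim and then applies a Faruqui-style retrofitting step toward a semantic lexicon, standing in for SociRel-461. The toy corpus, the hyperparameters, and the retrofit helper with its lexicon format are assumptions for illustration, not values or code taken from the paper.

```python
# Minimal sketch of the pipeline the abstract describes: train a
# domain-specific word2vec model, then retrofit its vectors toward a
# semantic lexicon (standing in for SociRel-461). Corpus, hyperparameters,
# and lexicon are illustrative assumptions, not taken from the paper.
from gensim.models import Word2Vec

# Toy corpus; the paper uses 100,000 full-text sociology articles.
sentences = [
    ["social", "capital", "shapes", "network", "formation"],
    ["cultural", "capital", "reproduces", "class", "inequality"],
]

# Skip-gram word2vec model (sg=1); vector_size/window are placeholders.
model = Word2Vec(sentences, vector_size=100, window=5, min_count=1, sg=1)

def retrofit(vectors, lexicon, iterations=10):
    """Retrofitting in the style of Faruqui et al. (2015): pull each
    vector toward the mean of its lexicon neighbours while anchoring
    it to its original position (alpha = 1, beta_ij = 1/degree)."""
    new_vecs = {word: vec.copy() for word, vec in vectors.items()}
    for _ in range(iterations):
        for word, neighbours in lexicon.items():
            nbrs = [n for n in neighbours if n in new_vecs]
            if word not in new_vecs or not nbrs:
                continue
            neighbour_mean = sum(new_vecs[n] for n in nbrs) / len(nbrs)
            new_vecs[word] = (neighbour_mean + vectors[word]) / 2.0
    return new_vecs

vectors = {w: model.wv[w] for w in model.wv.index_to_key}
lexicon = {"capital": ["class", "inequality"]}  # hypothetical relation entries
retrofitted = retrofit(vectors, lexicon)
```

Consistent with the abstract's finding, a step like this leaves already domain-fitted vectors largely in place but can pull general-language vectors toward domain-relevant neighbourhoods.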
Keywords: Natural language processing; Word2vec; Word embedding; Language model
Date: 2024
Downloads: http://link.springer.com/10.1007/s42001-024-00286-3 (abstract, text/html)
Access to the full text of the articles in this series is restricted.
Persistent link: https://EconPapers.repec.org/RePEc:spr:jcsosc:v:7:y:2024:i:2:d:10.1007_s42001-024-00286-3
Ordering information: This journal article can be ordered from
http://www.springer. ... iences/journal/42001
DOI: 10.1007/s42001-024-00286-3
Journal of Computational Social Science is currently edited by Takashi Kamihigashi
Bibliographic data for this series is maintained by Sonal Shukla and Springer Nature Abstracting and Indexing.