Introduction to Neural Transfer Learning With Transformers for Social Science Text Analysis

Sandra Wankmüller

Sociological Methods & Research, 2024, vol. 53, issue 4, 1676-1752

Abstract: Transformer-based models for transfer learning have the potential to achieve high prediction accuracies on text-based supervised learning tasks with relatively few training data instances. These models are thus likely to benefit social scientists who seek text-based measures that are as accurate as possible but have only limited resources for annotating training data. To enable social scientists to leverage these potential benefits for their research, this article explains how these methods work, why they might be advantageous, and what their limitations are. Additionally, three Transformer-based models for transfer learning (BERT, RoBERTa, and the Longformer) are compared to conventional machine learning algorithms on three applications. Across all evaluated tasks, textual styles, and training data set sizes, the conventional models are consistently outperformed by transfer learning with Transformers, demonstrating the benefits these models can bring to text-based social science research.
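To make the contrast in the abstract concrete, here is a minimal sketch (not taken from the article) of the kind of "conventional" supervised text classifier the paper uses as a baseline, with the transfer-learning alternative indicated in comments. The toy texts and labels are hypothetical, invented for illustration only.

```python
# Hypothetical toy data: short texts with binary annotations, standing in
# for the kind of hand-annotated training data social scientists produce.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = [
    "an excellent and fair policy proposal",
    "a great reform with clear benefits",
    "a terrible bill with harmful provisions",
    "a bad law that should be repealed",
]
labels = [1, 1, 0, 0]  # 1 = positive stance, 0 = negative stance (illustrative)

# Conventional approach: features (TF-IDF weights) and classifier weights
# are learned from scratch on the annotated data alone.
baseline = make_pipeline(TfidfVectorizer(), LogisticRegression())
baseline.fit(texts, labels)
print(baseline.predict(["a fair and excellent proposal"]))

# Transfer-learning approach (only sketched here, since it requires the
# `transformers` library and a pretrained-model download): start from
# pretrained weights such as "bert-base-uncased" and fine-tune the whole
# network on the same small annotated data set, so the model enters the
# task already equipped with general language knowledge.
```

The practical difference the article evaluates is that the baseline above must learn everything from the few annotated examples, whereas a fine-tuned Transformer reuses representations pretrained on large corpora, which is why it can reach higher accuracy from the same small training set.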

Keywords: natural language processing; deep learning; neural networks; transfer learning; Transformer; BERT
Date: 2024

Downloads: (external link)
https://journals.sagepub.com/doi/10.1177/00491241221134527 (text/html)



Persistent link: https://EconPapers.repec.org/RePEc:sae:somere:v:53:y:2024:i:4:p:1676-1752

DOI: 10.1177/00491241221134527


More articles in Sociological Methods & Research
Bibliographic data for this series maintained by SAGE Publications.

 
Page updated 2025-03-19
Handle: RePEc:sae:somere:v:53:y:2024:i:4:p:1676-1752