Transformer-Based Deep Neural Language Modeling for Construct-Specific Automatic Item Generation
Björn E. Hommel,
Franz-Josef M. Wollang,
Veronika Kotova,
Hannes Zacher and
Stefan C. Schmukle
Additional contact information
Björn E. Hommel: Leipzig University
Franz-Josef M. Wollang: magnolia psychometrics GmbH
Veronika Kotova: Technical University of Munich
Hannes Zacher: Leipzig University
Stefan C. Schmukle: Leipzig University
Psychometrika, 2022, vol. 87, issue 2, No 14, 749-772
Abstract:
Algorithmic automatic item generation can be used to obtain large quantities of cognitive items in the domains of knowledge and aptitude testing. However, conventional item models used by template-based automatic item generation techniques are not ideal for the creation of items for non-cognitive constructs. Progress in this area has been made recently by employing long short-term memory recurrent neural networks to produce word sequences that syntactically resemble items typically found in personality questionnaires. To date, such items have been produced unconditionally, without the possibility of selectively targeting personality domains. In this article, we offer a brief synopsis of past developments in natural language processing and explain why the automatic generation of construct-specific items has become attainable only due to recent technological progress. We propose that pre-trained causal transformer models can be fine-tuned to achieve this task using implicit parameterization in conjunction with conditional generation. We demonstrate this method in a tutorial-like fashion and finally compare aspects of validity in human- and machine-authored items using empirical data. Our study finds that approximately two-thirds of the automatically generated items show good psychometric properties (factor loadings above .40) and that one-third even have properties equivalent to established and highly curated human-authored items. Our work thus demonstrates the practical use of deep neural networks for non-cognitive automatic item generation.
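The abstract describes fine-tuning a pre-trained causal transformer so that generation can be steered toward a chosen personality domain. A common way to implement such conditional generation is to prefix each training item with a construct-specific control token; the sketch below illustrates only that data-formatting step. The token format (e.g. `<|extraversion|>`) and function names are hypothetical illustrations, not the exact scheme used in the article.

```python
# Minimal sketch of preparing conditional training sequences for
# construct-specific item generation. Assumption: the fine-tuning
# corpus pairs each item with a construct label, and a control token
# derived from that label is prepended so a causal language model
# learns the association.

def make_conditional_sequence(construct: str, item_text: str,
                              eos_token: str = "<|endoftext|>") -> str:
    """Prefix an item with a construct control token.

    At inference time, prompting the fine-tuned model with just the
    control token (e.g. "<|extraversion|>") steers generation toward
    items of that construct.
    """
    control_token = f"<|{construct.lower()}|>"
    return f"{control_token}{item_text}{eos_token}"

# Hypothetical example items (standard IPIP-style phrasings shown
# for illustration only).
training_pairs = [
    ("Extraversion", "I am the life of the party."),
    ("Neuroticism", "I worry about things."),
]

corpus = [make_conditional_sequence(c, t) for c, t in training_pairs]
```

In this scheme the construct label is carried implicitly by the token sequence itself rather than by an explicit conditioning input, which is one way to read the abstract's phrase "implicit parameterization in conjunction with conditional generation".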
Keywords: automatic item generation; natural language processing; deep learning; neural networks; language modeling
Date: 2022
Downloads:
http://link.springer.com/10.1007/s11336-021-09823-9 Abstract (text/html)
Access to the full text of the articles in this series is restricted.
Persistent link: https://EconPapers.repec.org/RePEc:spr:psycho:v:87:y:2022:i:2:d:10.1007_s11336-021-09823-9
Ordering information: This journal article can be ordered from
http://www.springer. ... gy/journal/11336/PS2
DOI: 10.1007/s11336-021-09823-9
Psychometrika is currently edited by Irini Moustaki
More articles in Psychometrika from Springer, The Psychometric Society
Bibliographic data for series maintained by Sonal Shukla and Springer Nature Abstracting and Indexing.