Social Biases in AI-Generated Creative Texts: A Mixed-Methods Approach in the Spanish Context
María Gabino-Campos,
José I. Baile and
Aura Padilla-Martínez
Additional contact information
María Gabino-Campos: Department of Communication Sciences and Social Work, University of La Laguna, 38200 La Laguna, Spain
José I. Baile: Department of Psychology, Faculty of Health Sciences and Psychology, Madrid Open University, 28400 Madrid, Spain
Aura Padilla-Martínez: Department of Journalism and Global Communication, Faculty of Information Sciences, Complutense University, 28040 Madrid, Spain
Social Sciences, 2025, vol. 14, issue 3, 1-12
Abstract:
This study addresses biases in artificial intelligence (AI) when generating creative content, a growing challenge given the widespread adoption of these technologies for producing automated narratives. Biases in AI reflect and amplify social inequalities, perpetuating stereotypes and limiting diverse representation in the generated outputs. Through an experimental approach with ChatGPT-4, biases related to age, gender, sexual orientation, ethnicity, religion, physical appearance, and socio-economic status are analyzed in AI-generated stories about successful individuals in the Spanish context. The results reveal an overrepresentation of young, heterosexual, and Hispanic characters, alongside a marked underrepresentation of diverse groups such as older individuals, ethnic minorities, and characters with varied socio-economic backgrounds. These findings support the hypothesis that AI systems replicate and amplify the biases present in their training data, thereby reinforcing social inequalities. To mitigate these effects, the study proposes solutions such as diversifying training datasets and conducting regular ethical audits, with the aim of fostering more inclusive AI systems that fairly represent human diversity and contribute to a more equitable society.
Keywords: algorithmic biases; AI ethics; AI narrative analysis; gender stereotypes; age biases; ethnic biases; social representation; training datasets; physical appearance; socio-economic status
JEL-codes: A B N P Y80 Z00
Date: 2025
Downloads:
https://www.mdpi.com/2076-0760/14/3/170/pdf (application/pdf)
https://www.mdpi.com/2076-0760/14/3/170/ (text/html)
Persistent link: https://EconPapers.repec.org/RePEc:gam:jscscx:v:14:y:2025:i:3:p:170-:d:1609781
Social Sciences is currently edited by Ms. Yvonne Chu
Bibliographic data for series maintained by MDPI Indexing Manager.