Biased Humans, (Un)Biased Algorithms?

Florian Pethig and Julia Kroenung
Additional contact information
Florian Pethig: Business School, University of Mannheim
Julia Kroenung: European Business School

Journal of Business Ethics, 2023, vol. 183, issue 3, No 1, 637-652

Abstract: Previous research has shown that algorithmic decisions can reflect gender bias. The increasingly widespread utilization of algorithms in critical decision-making domains (e.g., healthcare or hiring) can thus lead to broad and structural disadvantages for women. However, women often experience bias and discrimination through human decisions and may turn to algorithms in the hope of receiving neutral and objective evaluations. Across three studies (N = 1107), we examine whether women’s receptivity to algorithms is affected by situations in which they believe that their gender identity might disadvantage them in an evaluation process. In Study 1, we establish, in an incentive-compatible online setting, that unemployed women are more likely to choose to have their employment chances evaluated by an algorithm if the alternative is an evaluation by a man rather than a woman. Study 2 generalizes this effect by placing it in a hypothetical hiring context, and Study 3 proposes that relative algorithmic objectivity, i.e., the perceived objectivity of an algorithmic evaluator relative to a human evaluator, is a driver of women’s preferences for evaluations by algorithms as opposed to men. Our work sheds light on how women make sense of algorithms in stereotype-relevant domains and exemplifies the need to provide education for those at risk of being adversely affected by algorithmic decisions. Our results have implications for the ethical management of algorithms in evaluation settings. We advocate for improving algorithmic literacy so that evaluators and evaluatees (e.g., hiring managers and job applicants) can acquire the abilities required to reflect critically on algorithmic decisions.

Keywords: Algorithms; Gender bias; Stigma; Objectivity
Date: 2023
Citations: 2 (in EconPapers)

Downloads: (external link)
http://link.springer.com/10.1007/s10551-022-05071-8 Abstract (text/html)
Access to full text is restricted to subscribers.



Persistent link: https://EconPapers.repec.org/RePEc:kap:jbuset:v:183:y:2023:i:3:d:10.1007_s10551-022-05071-8

Ordering information: This journal article can be ordered from
http://www.springer. ... cs/journal/10551/PS2

DOI: 10.1007/s10551-022-05071-8


Journal of Business Ethics is currently edited by Michelle Greenwood and R. Edward Freeman

Page updated 2025-03-19
Handle: RePEc:kap:jbuset:v:183:y:2023:i:3:d:10.1007_s10551-022-05071-8