EconPapers    

Human Perceptions of Fairness: A Survey Experiment

Julian Sengewald, Anissa Schlichter, Markus Siepermann and Richard Lackes
Additional contact information
Julian Sengewald: Technical University Dortmund
Anissa Schlichter: Technical University Dortmund
Markus Siepermann: University of Applied Sciences
Richard Lackes: Technical University Dortmund

A chapter in Conceptualizing Digital Responsibility for the Information Age, 2025, pp 53-70 from Springer

Abstract: Algorithmic decision-making (ADM) through automation has benefits but must be implemented responsibly. Several mathematical definitions of fair outcomes exist, but it remains unclear how these align with human perceptions of fairness. We conducted a survey experiment (N = 258) examining common machine-learning definitions of fairness (demographic parity, equal opportunity, and equalized odds) in the context of algorithmic job interview invitations. We find that humans perceive the simple fairness definition of demographic parity as less fair than a more complex one that considers whether the invitees were eligible.
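The three fairness definitions named in the abstract can be illustrated with a minimal sketch in the chapter's job-interview setting. The toy data, variable names, and helper function below are hypothetical and not taken from the chapter; they only show how the criteria differ:

```python
# Toy sketch of the fairness definitions from the abstract (hypothetical data):
# two protected groups of applicants; y = actually eligible, yhat = invited.

def rate(pairs, y_value=None):
    """Invitation rate, optionally conditioned on eligibility y == y_value."""
    sel = [p for p in pairs if y_value is None or p[0] == y_value]
    return sum(yhat for _, yhat in sel) / len(sel)

# (eligible y, invited yhat) per applicant, by protected group
group_a = [(1, 1), (1, 1), (0, 0), (0, 0)]
group_b = [(1, 1), (0, 1), (0, 0), (0, 0)]

# Demographic parity: equal invitation rates, ignoring eligibility.
dp_a = rate(group_a)            # 2/4 = 0.5
dp_b = rate(group_b)            # 2/4 = 0.5

# Equal opportunity: equal invitation rates among the eligible (y == 1).
eo_a = rate(group_a, y_value=1) # 2/2 = 1.0
eo_b = rate(group_b, y_value=1) # 1/1 = 1.0

# Equalized odds additionally requires equal rates among the ineligible.
fpr_a = rate(group_a, y_value=0)  # 0/2 = 0.0
fpr_b = rate(group_b, y_value=0)  # 1/3
```

On this toy data both groups satisfy demographic parity and equal opportunity, but not equalized odds, since group_b's ineligible applicants are invited more often, which is the kind of distinction the abstract says respondents are sensitive to.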

Keywords: Perceived fairness; Survey experiment; Algorithmic decision-making
Date: 2025

There are no downloads for this item, see the EconPapers FAQ for hints about obtaining it.

Persistent link: https://EconPapers.repec.org/RePEc:spr:lnichp:978-3-031-80119-8_4

Ordering information: This item can be ordered from
http://www.springer.com/9783031801198

DOI: 10.1007/978-3-031-80119-8_4

More chapters in Lecture Notes in Information Systems and Organization from Springer
Bibliographic data for series maintained by Sonal Shukla and Springer Nature Abstracting and Indexing.

 
Page updated 2025-05-19
Handle: RePEc:spr:lnichp:978-3-031-80119-8_4