How to Choose a Fairness Measure: A Decision-Making Workflow for Auditors
Federica Picogna,
Jacques de Swart,
Heysem Kaya and
Ruud Wetzels
Additional contact information
Federica Picogna: Nyenrode Business University
Heysem Kaya: Utrecht University
No cpxmf_v2, OSF Preprints from Center for Open Science
Abstract:
Recent developments in Artificial Intelligence (AI) have greatly benefited society, but they also come with risks. One of those risks is that AI has the potential to discriminate against certain groups of people. To address that risk, landmark regulations such as the AI Act have been created, requiring AI systems to be fair and tasking auditors with ensuring their compliance. To do so, auditors use fairness measures. However, selecting a specific definition of fairness from the various available options, and a fairness measure from the numerous possibilities, complicates the auditing process, making it challenging for auditors to correctly assess AI fairness. To assist them, we created a decision-making workflow that guides the auditor through the selection of the most appropriate measure and, consequently, the most suitable definition of fairness. To simplify the use of this workflow, we have integrated it into the open-source program JASP for Audit and demonstrated its functionality with two examples: the COMPAS recidivism case and the DUO case.
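As a rough illustration of the kind of group-fairness measures the abstract refers to, the sketch below computes two commonly used metrics, the statistical parity difference and the equal opportunity difference, on synthetic data. The function names, the data, and the two-group encoding are assumptions made here for illustration; they are not taken from the paper or from JASP for Audit, whose workflow selects among many more measures.

import numpy as np

def statistical_parity_difference(y_pred, group):
    """Difference in positive-prediction rates between two groups.

    A value of 0 means both groups receive positive predictions at
    the same rate (demographic parity).
    """
    y_pred, group = np.asarray(y_pred), np.asarray(group)
    rate_a = y_pred[group == 0].mean()  # positive rate, group 0
    rate_b = y_pred[group == 1].mean()  # positive rate, group 1
    return rate_b - rate_a

def equal_opportunity_difference(y_true, y_pred, group):
    """Difference in true-positive rates between two groups.

    A value of 0 means truly positive individuals in both groups are
    equally likely to receive a positive prediction (equal opportunity).
    """
    y_true, y_pred, group = map(np.asarray, (y_true, y_pred, group))
    tpr_a = y_pred[(group == 0) & (y_true == 1)].mean()  # TPR, group 0
    tpr_b = y_pred[(group == 1) & (y_true == 1)].mean()  # TPR, group 1
    return tpr_b - tpr_a

# Small synthetic example: true labels, model predictions, and a
# binary protected-group indicator for eight individuals.
y_true = np.array([1, 0, 1, 1, 0, 1, 0, 1])
y_pred = np.array([1, 0, 1, 0, 0, 1, 1, 1])
group  = np.array([0, 0, 0, 0, 1, 1, 1, 1])

print(statistical_parity_difference(y_pred, group))         # 0.25
print(equal_opportunity_difference(y_true, y_pred, group))  # ~0.33

The two metrics can disagree, which is precisely why choosing a fairness definition matters: here group 1 receives more positive predictions overall and also has a higher true-positive rate, but other datasets can satisfy one criterion while violating the other.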
Date: 2025-03-26
Downloads: https://osf.io/download/67e2bbd3763682cadf496fdd/
Persistent link: https://EconPapers.repec.org/RePEc:osf:osfxxx:cpxmf_v2
DOI: 10.31219/osf.io/cpxmf_v2