High-stakes team based public sector decision making and AI oversight
Deborah Morgan,
Youmna Hashem,
Vincent John Straub and
Jonathan Bright
Paper No. arq3w, SocArXiv, Center for Open Science
Abstract:
Oversight mechanisms, whereby the functioning and behaviour of AI systems are controlled to ensure that they are tuned to public benefit, are a core aspect of human-centered AI. They are especially important in public sector AI applications, where decisions on core public services such as education, benefits, and child welfare have significant impacts. Much current thinking on oversight mechanisms revolves around the idea of human decision makers being present ‘in the loop’ of decision making, such that they can insert expert judgment at critical moments and thus rein in the functioning of the machine. While welcome, we believe that the theory of human-in-the-loop oversight has yet to fully engage with the fact that decisions, especially in high-stakes contexts, are often currently made by hierarchical teams rather than by a single individual. This raises the question of how such hierarchical structures can effectively engage with an AI system that is either supporting or making decisions. In this position paper, we outline some of the key elements of hierarchical decision making in contemporary public services and show how they relate to current thinking about AI oversight, thus sketching out future research directions for the field.
Date: 2022-12-01
Downloads: https://osf.io/download/63887d43a98e5f2db41035fb/
Persistent link: https://EconPapers.repec.org/RePEc:osf:socarx:arq3w
DOI: 10.31219/osf.io/arq3w