Algorithmic Justice in Child Protection: Statistical Fairness, Social Justice and the Implications for Practice

Emily Keddell
Additional contact information
Emily Keddell: Social and Community Work Programme, School of Social Science, University of Otago, Dunedin 9054, Aotearoa, New Zealand

Social Sciences, 2019, vol. 8, issue 10, 1-22

Abstract: Algorithmic tools are increasingly used in child protection decision-making. Fairness considerations of algorithmic tools usually focus on statistical fairness, but there are broader justice implications relating to the data used to construct source databases, and to how algorithms are incorporated into complex sociotechnical decision-making contexts. This article explores how the data that inform child protection algorithms are produced and relates this production to both traditional notions of statistical fairness and broader justice concepts. Predictive tools pose a number of challenging problems in the child protection context, as the data they draw on do not represent child abuse incidence across the population, and child abuse itself is difficult to define, making the key decisions that become data variable and subjective. Algorithms using these data have distorted feedback loops and can contain inequalities and biases. The challenge to justice concepts is that individual and group rights to non-discrimination become threatened as the algorithm itself becomes skewed, leading to inaccurate risk predictions based on spurious correlations. The right to be treated as an individual is threatened when statistical risk is based on a group categorisation, and the rights of families to understand and participate in the decisions made about them are difficult to uphold when they have not consented to data linkage and the function of the algorithm is obscured by its complexity. The use of uninterpretable algorithmic tools may create ‘moral crumple zones’, where practitioners are held responsible for decisions even when those decisions are partially determined by an algorithm. Many of these criticisms can also be levelled at human decision makers in the child protection system, but the reification of these processes within algorithms renders their articulation even more difficult and can diminish other important relational and ethical aims of social work practice.

Keywords: child protection; predictive analytics; rights; social justice; algorithms; decision making (search for similar items in EconPapers)
JEL-codes: A B N P Y80 Z00 (search for similar items in EconPapers)
Date: 2019
References: View references in EconPapers | View complete reference list from CitEc
Citations: View citations in EconPapers (4)

Downloads: (external link)
https://www.mdpi.com/2076-0760/8/10/281/pdf (application/pdf)
https://www.mdpi.com/2076-0760/8/10/281/ (text/html)

Related works:
This item may be available elsewhere in EconPapers: Search for items with the same title.

Export reference: BibTeX RIS (EndNote, ProCite, RefMan) HTML/Text

Persistent link: https://EconPapers.repec.org/RePEc:gam:jscscx:v:8:y:2019:i:10:p:281-:d:274114

Access Statistics for this article

Social Sciences is currently edited by Ms. Yvonne Chu

More articles in Social Sciences from MDPI
Bibliographic data for series maintained by MDPI Indexing Manager ().

Page updated 2025-03-19
Handle: RePEc:gam:jscscx:v:8:y:2019:i:10:p:281-:d:274114