When noise mitigates bias in human–algorithm decision-making: An agent-based model
Spencer Poodiack Parsons and René Torenvlied
PLOS ONE, 2025, vol. 20, issue 12, 1-20
Abstract:
Algorithmic systems increasingly inform human decision-making in domains such as criminal justice, healthcare, and finance. Although algorithms can exhibit bias, they are much less prone to undesirable variability in judgments (noise) than human decision-makers. Although this noiselessness is presented as an advantageous feature of algorithmic advice, we know little about how (biased) algorithmic advice interacts with noisy human judgment. Does undesirable variability in human judgment decrease under noiseless algorithmic advice? Is bias in human judgment exacerbated or mitigated by noise in advice? To answer these questions, we built an agent-based model that simulates the judgment of decision-makers receiving guidance from a (more or less) biased algorithm or a (more or less) biased and noisy human advisor. The model simulations show that, contrary to expectations, noise can be desirable: human noise can mitigate the harms of algorithmic bias by dampening the influence of algorithmic advice. Noise in human advice leads decision-makers to rely more heavily on their prior beliefs, an emergent behavior with implications for belief updating. When decision-makers’ prior beliefs are polarized, an asymmetry occurs: decision-makers respond only to interventionist advice and not to non-interventionist cues. Finally, the model simulations show that population-level variability in decision-making stems from occasion noise in the environment and not from noise in human advice. This result challenges the common wisdom that population-level noise can be straightforwardly decomposed into individual-level sources, and it questions the feasibility of noise audits in organizations. Together, these findings demonstrate that the absence of noise as a feature of algorithmic advice is not generally desirable, suggesting critical implications for how human-algorithm systems are designed, regulated, and evaluated.
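The record gives only a verbal description of the model. The Python sketch below is a hypothetical illustration of the mechanism the abstract describes, assuming agents combine a prior belief with (biased, possibly noisy) advice through simple precision weighting; the parameter names, the updating rule, and the values are assumptions for illustration, not the authors' published specification.

import numpy as np

rng = np.random.default_rng(seed=1)

def simulate(n_agents=2000, advice_bias=1.0, advice_noise_sd=0.0,
             prior_sd=1.0, occasion_noise_sd=0.5):
    """One decision round per agent: combine a prior belief with advice.

    Illustrative assumptions (not the published model):
      * the quantity to be judged has true value 0;
      * advice = true value + advice_bias + advice noise
        (advice_noise_sd = 0 mimics a noiseless algorithm);
      * agents weight the advice by its precision relative to the prior,
        so noisier advice is discounted in favour of the prior;
      * occasion noise perturbs each judgment independently.
    """
    true_value = 0.0
    priors = rng.normal(true_value, prior_sd, size=n_agents)
    advice = true_value + advice_bias + rng.normal(0.0, advice_noise_sd, size=n_agents)

    # Precision weighting: the weight placed on advice shrinks as advice noise grows.
    prior_var = prior_sd ** 2
    advice_var = advice_noise_sd ** 2 + 1e-6   # small floor avoids division by zero
    w_advice = prior_var / (prior_var + advice_var)

    occasion = rng.normal(0.0, occasion_noise_sd, size=n_agents)
    judgments = (1 - w_advice) * priors + w_advice * advice + occasion
    return judgments

algo_judgments = simulate(advice_noise_sd=0.0)   # biased, noiseless "algorithmic" advice
human_judgments = simulate(advice_noise_sd=1.5)  # equally biased but noisy "human" advice

print(f"bias absorbed from algorithmic advice: {algo_judgments.mean():+.2f}")
print(f"bias absorbed from human advice:       {human_judgments.mean():+.2f}")
print(f"population SD (algorithm / human):     {algo_judgments.std():.2f} / {human_judgments.std():.2f}")

Under these assumptions, the noiseless algorithm's bias passes almost fully into the judgments, whereas noisy advice is down-weighted and agents fall back on their priors, which is the qualitative pattern the abstract reports.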
Date: 2025
Downloads: (external link)
https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0339273 (text/html)
https://journals.plos.org/plosone/article/file?id= ... 39273&type=printable (application/pdf)
Persistent link: https://EconPapers.repec.org/RePEc:plo:pone00:0339273
DOI: 10.1371/journal.pone.0339273