Remedies against bias in analytics systems
John Steven Edwards and Eduardo Rodriguez
Journal of Business Analytics, 2019, vol. 2, issue 1, 74-87
Abstract:
Advances in IT offer the possibility to develop ever more complex predictive and prescriptive systems based on analytics. Organizations are beginning to rely on the outputs from these systems without inspecting them, especially if they are embedded in the organization’s operational systems. This reliance could be misplaced, unethical or even illegal if the systems contain bias. Data, algorithms and machine learning methods are all potentially subject to bias. In this article we explain the ways in which bias might arise in analytics systems, present some examples, and give some suggestions as to how to prevent or reduce it. We use a framework inspired by the work of Hammond, Keeney and Raiffa on psychological traps in human decision-making. Each of these traps “translates” into a potential type of bias for an analytics-based system. Fortunately, this means that remedies to reduce bias in human decision-making also translate into potential remedies for algorithmic systems.
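The abstract does not describe a specific implementation, but one remedy it alludes to is inspecting system outputs rather than accepting them unchecked. As a purely illustrative sketch, not taken from the paper, the Python snippet below shows one common way to audit an analytics system's decisions for one kind of bias: comparing positive-prediction rates across groups, often called a demographic-parity or disparate-impact check. All names, data and values here are hypothetical.

from collections import defaultdict

def positive_rate_by_group(predictions, groups):
    """Share of positive (e.g. 'approve') decisions for each group."""
    counts = defaultdict(lambda: [0, 0])  # group -> [positives, total]
    for pred, group in zip(predictions, groups):
        counts[group][0] += int(pred == 1)
        counts[group][1] += 1
    return {g: pos / total for g, (pos, total) in counts.items()}

def disparate_impact_ratio(rates):
    """Ratio of the lowest to the highest positive rate (1.0 = parity)."""
    lo, hi = min(rates.values()), max(rates.values())
    return lo / hi if hi > 0 else float("nan")

if __name__ == "__main__":
    # Hypothetical model decisions (1 = positive outcome) and group labels.
    preds = [1, 0, 1, 1, 0, 0, 1, 0]
    groups = ["A", "A", "A", "B", "B", "B", "B", "B"]
    rates = positive_rate_by_group(preds, groups)
    print(rates)                          # {'A': 0.667, 'B': 0.4}
    print(disparate_impact_ratio(rates))  # 0.6; values well below 1 flag possible bias

A check of this kind is only a starting point: a low ratio signals that the system's outputs deserve the closer inspection the article argues for, not that bias is proven or absent.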
Date: 2019
Citations: View citations in EconPapers (1)
Downloads: http://hdl.handle.net/10.1080/2573234X.2019.1633890 (text/html)
Access to full text is restricted to subscribers.
Persistent link: https://EconPapers.repec.org/RePEc:taf:tjbaxx:v:2:y:2019:i:1:p:74-87
Ordering information: This journal article can be ordered from
http://www.tandfonline.com/pricing/journal/tjba20
DOI: 10.1080/2573234X.2019.1633890
Journal of Business Analytics is currently edited by Dursun Delen
More articles in Journal of Business Analytics from Taylor & Francis Journals
Bibliographic data for series maintained by Chris Longhurst.