First, Do No Harm: Algorithms, AI, and Digital Product Liability
Marc J. Pfeiffer
Papers from arXiv.org
Abstract:
The ethical imperative for technology should be "first, do no harm." But digital innovations like AI and social media increasingly enable societal harms, from bias to misinformation. As these technologies grow ubiquitous, we need solutions to address their unintended consequences. This report proposes a model to incentivize developers to prevent foreseeable algorithmic harms by expanding negligence and product liability laws. Digital product developers would be incentivized to mitigate potential algorithmic risks before deployment to protect themselves and their investors. Standards and penalties would be set proportional to harm. Insurers would require harm mitigation during development as a condition of coverage. This shifts tech ethics from "move fast and break things" to "first, do no harm." Details would need careful refinement among stakeholders to enact reasonable guardrails without stifling innovation. Policy and harm-prevention frameworks would likely evolve over time. Similar accountability schemes have helped address workplace, environmental, and product safety. Introducing algorithmic harm negligence liability would acknowledge the real societal costs of unethical tech. The timing is right for reform. This proposal provides a model to steer the digital revolution toward human rights and dignity. Harm prevention must be prioritized over reckless growth, and vigorous liability policies are essential to stop technologists from breaking things.
Date: 2023-11
New Economics Papers: this item is included in nep-ain, nep-law and nep-pay
Downloads: http://arxiv.org/pdf/2311.10861 (latest version, PDF)
Persistent link: https://EconPapers.repec.org/RePEc:arx:papers:2311.10861