Emerging Regulations on Content Moderation and Misinformation Policies of Online Media Platforms: Accommodating the Duty of Care into Intermediary Liability Models
Caio C. V. Machado and Thaís Helena Aguiar
Business and Human Rights Journal, 2023, vol. 8, issue 2, 244-251
Abstract:
Disinformation, hate speech and political polarization are pressing problems arising from the growing relevance of information and communication technologies (ICTs) in contemporary societies. To address these issues, decision-makers and regulators worldwide are discussing the role of digital platforms in moderating content and curtailing harmful material produced by third parties. However, intermediary liability rules require a balance between the risks of harmful content circulating at scale and the risks of censorship that arise when excessive burdens push content providers toward a risk-averse posture in content moderation. This piece examines the trend of altering intermediary liability models to include ‘duty of care’ provisions, describing three models in Europe, North America and South America. We discuss how these models are being modified to place greater monitoring and takedown burdens on internet content providers. We conclude with a word of caution regarding the balance between censorship and freedom of expression.
Date: 2023
Persistent link: https://EconPapers.repec.org/RePEc:cup:buhurj:v:8:y:2023:i:2:p:244-251_8
More articles in Business and Human Rights Journal from Cambridge University Press, UPH, Shaftesbury Road, Cambridge CB2 8BS, UK.
Bibliographic data for series maintained by Kirk Stebbing.