Public attitudes towards algorithmic personalization and use of personal data online: evidence from Germany, Great Britain, and the United States

Anastasia Kozyreva, Philipp Lorenz-Spreen, Ralph Hertwig, Stephan Lewandowsky and Stefan M. Herzog
Additional contact information
Anastasia Kozyreva: Center for Adaptive Rationality, Max Planck Institute for Human Development
Philipp Lorenz-Spreen: Center for Adaptive Rationality, Max Planck Institute for Human Development
Ralph Hertwig: Center for Adaptive Rationality, Max Planck Institute for Human Development
Stephan Lewandowsky: School of Psychological Science, University of Bristol
Stefan M. Herzog: Center for Adaptive Rationality, Max Planck Institute for Human Development

Palgrave Communications, 2021, vol. 8, issue 1, 1-11

Abstract: People rely on data-driven AI technologies nearly every time they go online, whether they are shopping, scrolling through news feeds, or looking for entertainment. Yet despite their ubiquity, personalization algorithms and the associated large-scale collection of personal data have largely escaped public scrutiny. Policy makers who wish to introduce regulations that respect people’s attitudes towards privacy and algorithmic personalization on the Internet would greatly benefit from knowing how people perceive personalization and personal data collection. To contribute to an empirical foundation for this knowledge, we surveyed public attitudes towards key aspects of algorithmic personalization and people’s data privacy concerns and behavior using representative online samples in Germany (N = 1065), Great Britain (N = 1092), and the United States (N = 1059). Our findings show that people object to the collection and use of sensitive personal information and to the personalization of political campaigning and, in Germany and Great Britain, to the personalization of news sources. Encouragingly, attitudes are independent of political preferences: People across the political spectrum share the same concerns about their data privacy and show similar levels of acceptance regarding personalized digital services and the use of private data for personalization. We also found an acceptability gap: People are more accepting of personalized services than of the collection of personal data and information required for these services. A large majority of respondents rated, on average, personalized services as more acceptable than the collection of personal information or data. The acceptability gap can be observed at both the aggregate and the individual level. Across countries, between 64% and 75% of respondents showed an acceptability gap. Our findings suggest a need for transparent algorithmic personalization that minimizes use of personal data, respects people’s preferences on personalization, is easy to adjust, and does not extend to political advertising.

Date: 2021
References: View references in EconPapers; view complete reference list from CitEc
Citations: View citations in EconPapers (5)

Downloads: (external link)
http://link.springer.com/10.1057/s41599-021-00787-w Abstract (text/html)
Access to full text is restricted to subscribers.


Persistent link: https://EconPapers.repec.org/RePEc:pal:palcom:v:8:y:2021:i:1:d:10.1057_s41599-021-00787-w

Ordering information: This journal article can be ordered from
https://www.nature.com/palcomms/about

DOI: 10.1057/s41599-021-00787-w


More articles in Palgrave Communications from Palgrave Macmillan
Bibliographic data for series maintained by Sonal Shukla and Springer Nature Abstracting and Indexing.

 
Handle: RePEc:pal:palcom:v:8:y:2021:i:1:d:10.1057_s41599-021-00787-w