Promise Into Practice: Application of Computer Vision in Empirical Research on Social Distancing
Wim Bernasco, Evelien M. Hoeben, Dennis Koelma, Lasse Suonperä Liebst, Josephine Thomas, Joska Appelman, Cees G. M. Snoek and Marie Rosenkrantz Lindegaard
Sociological Methods & Research, 2023, vol. 52, issue 3, 1239-1287
Abstract:
Social scientists increasingly use video data, but large-scale analysis of its content is often constrained by scarce manual coding resources. Upscaling may be possible with the application of automated coding procedures, which are being developed in the field of computer vision. Here, we introduce computer vision to social scientists, review the state-of-the-art in relevant subfields, and provide a working example of how computer vision can be applied in empirical sociological work. Our application involves defining a ground truth by human coders, developing an algorithm for automated coding, testing the performance of the algorithm against the ground truth, and running the algorithm on a large-scale dataset of CCTV images. The working example concerns monitoring social distancing behavior in public space over more than a year of the COVID-19 pandemic. Finally, we discuss prospects for the use of computer vision in empirical social science research and address technical and ethical challenges.
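As a concrete illustration of the workflow summarized in the abstract, the sketch below detects pedestrians in a single CCTV frame with an off-the-shelf detector and flags pairs standing closer together than a distance limit. It is not the authors' pipeline: the detector (torchvision's COCO-pretrained Faster R-CNN), the example frame path cctv_frame.jpg, the pixel-to-metre scale PIXELS_PER_METER, and the 1.5 m threshold are illustrative assumptions. A production system would use a detector suited to CCTV viewpoints and a calibrated ground-plane mapping, and its output would be validated against human-coded ground truth as described in the article.

```python
"""Illustrative sketch only; assumptions are noted in the comments."""
from itertools import combinations

import torch
from torchvision.io import read_image
from torchvision.models.detection import fasterrcnn_resnet50_fpn

PERSON_LABEL = 1          # COCO class id for "person"
SCORE_THRESHOLD = 0.8     # keep only confident detections
PIXELS_PER_METER = 60.0   # hypothetical scale; a real setup needs camera calibration
DISTANCE_LIMIT_M = 1.5    # e.g. the 1.5 m norm used in the Netherlands during COVID-19


def detect_people(frame_path: str) -> torch.Tensor:
    """Return bounding boxes (x1, y1, x2, y2) of detected pedestrians."""
    model = fasterrcnn_resnet50_fpn(weights="DEFAULT")  # torchvision >= 0.13
    model.eval()
    image = read_image(frame_path).float() / 255.0       # CHW tensor in [0, 1]
    with torch.no_grad():
        output = model([image])[0]
    keep = (output["labels"] == PERSON_LABEL) & (output["scores"] >= SCORE_THRESHOLD)
    return output["boxes"][keep]


def count_violations(boxes: torch.Tensor) -> int:
    """Count pedestrian pairs closer than the distance limit.

    Distance is approximated between the bottom-centre ("foot") points of the
    boxes and converted to metres with the assumed pixel scale.
    """
    feet = torch.stack([(boxes[:, 0] + boxes[:, 2]) / 2, boxes[:, 3]], dim=1)
    violations = 0
    for i, j in combinations(range(len(feet)), 2):
        distance_m = torch.dist(feet[i], feet[j]).item() / PIXELS_PER_METER
        if distance_m < DISTANCE_LIMIT_M:
            violations += 1
    return violations


if __name__ == "__main__":
    boxes = detect_people("cctv_frame.jpg")  # hypothetical example frame
    print(f"pedestrians detected: {len(boxes)}")
    print(f"distance violations:  {count_violations(boxes)}")
```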
Keywords: Computer vision; video data analysis; deep learning; pedestrian detection; social distancing
Date: 2023
Downloads: https://journals.sagepub.com/doi/10.1177/00491241221099554 (text/html)
Persistent link: https://EconPapers.repec.org/RePEc:sae:somere:v:52:y:2023:i:3:p:1239-1287
DOI: 10.1177/00491241221099554