Algorithmic discrimination, the role of GPS, and the limited scope of EU non-discrimination law
Elena Gramano and Miriam Kullmann
Chapter 4 in A Research Agenda for the Gig Economy and Society, Edward Elgar Publishing, 2022, pp. 53-72
Abstract:
This chapter investigates the potentially discriminatory outcomes of algorithmic decision-making and the effectiveness and suitability of the current legal framework in preventing and sanctioning all discrimination perpetrated through algorithms. We build a research agenda by drawing on the concrete implications and issues that stem from two recent cases, which happen to be the first court decisions on this matter. Adopting a practical approach by analysing how the two courts use the existing legal sources could, we believe, reframe the debate and call attention to the most problematic aspects of this matter. Our core argument is that algorithms can bring about direct or, more often, indirect discrimination, even where the decision-making process rests on factors that are neutral on their face, attributing no significance to workers' personal characteristics or circumstances, at least in theory. Moreover, GPS (Global Positioning System), as an instrument used by businesses, including platforms, to gather data unrelated or not directly related to work performance, can be used to discriminate or can produce discriminatory effects. Here, we focus on how GPS is being repurposed by platform companies that offer on-location services and on the extent to which GPS data can be covered by EU non-discrimination law. It will become increasingly clear that, even if GPS were brought within the scope of one or more of the protected grounds, since the data collected by GPS can serve as a proxy for race or ethnic origin, age, or even gender ('proxy discrimination'), a second challenge remains: the substantive scope of most EU non-discrimination law. This shows how perilous the gig economy can be: workers, as defined by EU case law, are protected by EU law against discrimination; the self-employed, or contractors, as platform companies often classify their workforce, are not, or not entirely. The chapter ends with a discussion rooted in the premise that algorithms do not differentiate between different contractual underpinnings and are thus 'blind' to some extent: we ask whether EU non-discrimination law needs broadening to protect a larger group of platform workers, especially when employee and self-employed platform workers cannot be distinguished on the face of it.
Keywords: Business and Management; Development Studies; Economics and Finance; Law - Academic; Politics and Public Policy; Sociology and Social Policy
Date: 2022
Downloads:
https://www.elgaronline.com/view/edcoll/9781800883512/9781800883512.00011.xml (application/pdf)
Persistent link: https://EconPapers.repec.org/RePEc:elg:eechap:20577_4
Ordering information: This item can be ordered from http://www.e-elgar.com