From Audience to Evaluator: When Visibility into Prior Evaluations Leads to Convergence or Divergence in Subsequent Evaluations Among Professionals
Tristan L. Botelho
Additional contact information
Tristan L. Botelho: Yale University, New Haven, Connecticut 06520
Organization Science, 2024, vol. 35, issue 5, 1682-1703
Abstract:
Collective evaluation processes, which offer individuals an opportunity to assess quality, have spread beyond mainstream sectors (e.g., books, restaurants) to permeate professional contexts, from within and across organizations to the gig economy. This paper introduces a theoretical framework for understanding how evaluators’ visibility into prior evaluations influences the subsequent evaluation process: the likelihood of evaluating at all and the value of the evaluations that are ultimately submitted. Central to this discussion are the conditions under which evaluations converge (become more similar to prior evaluations) or diverge (become less similar), as well as the mechanisms driving the observed outcomes. Using a quasi-natural experiment on a platform where investment professionals submit and evaluate investment recommendations, I compare evaluations made with and without the possibility of prior ratings influencing the subsequent evaluation process. I find that when prior ratings are visible, convergence occurs: the visibility of prior evaluations decreases the likelihood that a subsequent evaluation occurs by about 50%, and subsequent evaluations are 54%–63% closer to the visible rating. Further analysis suggests that peer deference is a dominant mechanism driving convergence and that only professionals with specialized expertise resist it. Notably, there is no evidence that initial ratings are related to long-term performance; in this context, convergence therefore distorts the available quality signal for a recommendation. These findings underscore how the structure of evaluation processes can perpetuate initial stratification, even among professionals with baseline levels of expertise.
Keywords: evaluations; inequality; influence; professionals; stratification; ratings; digital platforms
Date: 2024
Downloads: http://dx.doi.org/10.1287/orsc.2017.11285 (application/pdf)
Persistent link: https://EconPapers.repec.org/RePEc:inm:ororsc:v:35:y:2024:i:5:p:1682-1703
More articles in Organization Science from INFORMS.
Bibliographic data for series maintained by Chris Asher.