September 18, 2023
Abstract
An enormous body of academic and journalistic work argues that opaque recommendation algorithms contribute to political polarization by promoting increasingly extreme content. We present evidence that challenges this dominant view, drawing on three large-scale, multi-wave experiments with a combined N of 7,851 human users, consistently showing that extremizing algorithmic recommendations has limited effects on opinions. Our experiments employ a custom-built video platform with a naturalistic, YouTube-like interface that presents real videos and recommendations drawn directly from YouTube. We experimentally manipulate YouTube’s actual recommendation algorithm to create ideologically balanced and slanted variations. Our design allows us to directly intervene in a cyclical feedback loop that has long confounded the study of algorithmic polarization—the complex interplay between the algorithmic supply of content recommendations and user demand for that content—to examine the downstream effects of recommendation-consumption cycles on policy attitudes. We use data on over 125,000 experimentally manipulated recommendations and 26,000 platform interactions to estimate how recommendation algorithms alter users’ media consumption decisions and, indirectly, their political attitudes. Our work builds on recent observational studies showing that algorithm-driven “rabbit holes” of recommendations may be less prevalent than previously thought. We provide new experimental evidence casting further doubt on widely circulating theories of algorithmic polarization, showing that even large perturbations of real-world recommendation systems that substantially modify consumption patterns have limited causal effects on policy attitudes. Our methodology, which captures and modifies the output of real-world recommendation algorithms, offers a path forward for future investigations of black-box artificial intelligence systems.
However, our findings also reveal practical limits on the effect sizes that academic experiments can feasibly detect.
Citation
Liu, Naijia, Matthew A. Baum, Adam J. Berinsky, Allison J.B. Chaney, Justin de Benedictis-Kessner, Andy Guess, Dean Knox, Christopher Lucas, Rachel Mariman, and Brandon M. Stewart. "Algorithmic recommendations have limited effects on polarization: A naturalistic experiment on YouTube." September 18, 2023.