A new study claims that Meta's implementation of 63 temporary "break glass" Facebook algorithm changes between Nov. 2020 and Mar. 2021 undermined the conclusion of a 2023 paper in the journal Science that the social media platform did not contribute to polarization.
That paper, published in July 2023 by Guess et al. in a funded collaboration with Meta, drew on data collected between September and December 2020 and concluded that Meta's machine-learning feed algorithm "did not cause detectable changes" in polarization and exposed users to less untrustworthy news content than a reverse-chronological feed.
However, Bagchi et al. argue that temporary algorithm changes intended to "diminish the spread of voter-fraud misinformation" following the 2020 US Presidential Election affected the "validity and conclusion" of the study by altering the experiment's control condition.