Snapshot 9: Wed, Oct 2, 2024 3:09:37 AM GMT, last edited by Adam

Study: Facebook Altered Algorithm During Study on Trustworthy News

Above: The Facebook logo displayed on a smartphone screen in Athens, Greece, on Sept. 23, 2024. Image copyright: Nikolas Kokovlis/Contributor/NurPhoto via Getty Images

The Facts

  • New research claims that 63 temporary Facebook algorithm changes, made by parent company Meta between November 2020 and March 2021, affected a study published in 2023 concerning whether the social media platform encouraged untrustworthy news.

  • The study, conducted jointly by academic researchers and Meta, analyzed data from September to December 2020. It found that Meta's machine-learning algorithm didn't cause noticeable changes in polarization and showed less untrustworthy news content than a reverse chronological feed.

  • At the time, the results of Guess et al.'s study were covered by multiple media outlets including The Washington Post, AP News, and The Atlantic.

The Spin

Social media has played a key part in the rise of polarization, with algorithms and fake news seeking to exploit audiences who continue to spend an increasing amount of time online. Unless there's an immediate and widespread effort to teach key skills to help identify and combat this digital epidemic, social cohesion will continue to crumble in the face of division and hatred.

Social media is one of a multitude of independent factors that can be attributed to today's tide of polarization, ranging from a country's political freedom to the unique psychological state of any given individual. While social media can't be denied as an important part of the puzzle, solving the problem of an increasingly divided world involves more than placing sole blame on online platforms.


Everyone knows that misinformation is a problem — across the world and with respect to all ideologies — but that doesn't mean we should give tech companies or governments the power to define what is true. Whether it's the Chinese government-linked TikTok app or politicians in Western democracies, so-called "defenders" of democracy are going to use this fearmongering as a trojan horse to impose rules in their favor.

Metaculus Prediction

There is a 73% chance that Meta will claim there to have been AI-driven "coordinated inauthentic behavior" aimed at influencing the 2024 US Presidential Election, according to the Metaculus prediction community.