Interaction on misinformation stories fell by 65%

Sep 16, 2018 08:51 GMT  ·  By

A not-yet-peer-reviewed, ongoing study by researchers from New York University, Microsoft Research, and Stanford University says that Facebook's false news problem has drastically declined since the social network decided to fight it after the 2016 election, according to a Slate report.

The study was designed to measure how misinformation travels across both Facebook and Twitter, using a base of 570 different websites classified by fact-checking organizations as sources of fake news.

The research team counted the volume of engagements on Facebook and the number of shares on Twitter, carefully comparing the two against each other for each month throughout the study period.

BuzzSumo, a database of interaction volumes for content shared on social networks, was the tool used to make heads or tails of what was happening with fake news content on Twitter and Facebook from January 2015 to July 2018.

"The fact that Facebook engagements and Twitter shares follow similar trends prior to late 2016 and for the non-fake-news sites in our data, but diverge sharply for fake news sites following the election, suggests that some factor has slowed the relative diffusion of misinformation on Facebook," the study states.
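The divergence the researchers describe can be sketched with a few lines of Python. This is purely illustrative, not the study's actual code: the monthly figures below are made up, and the idea is simply that if the ratio of Facebook engagements to Twitter shares for a fake news site collapses after late 2016 while Twitter holds steady, something Facebook-specific has changed.

```python
# Illustrative sketch with invented numbers: track the monthly ratio of
# Facebook engagements to Twitter shares for a single fake news site.
monthly = {
    # month: (facebook_engagements, twitter_shares)
    "2016-09": (1_000_000, 50_000),
    "2016-10": (980_000, 49_500),
    "2017-06": (400_000, 52_000),  # Facebook falls while Twitter holds steady
    "2018-07": (180_000, 51_000),
}

# Ratio of Facebook engagements to Twitter shares per month.
ratios = {month: fb / tw for month, (fb, tw) in monthly.items()}
baseline = ratios["2016-09"]

for month, ratio in ratios.items():
    print(f"{month}: FB/TW ratio = {ratio:.1f} ({ratio / baseline:.0%} of baseline)")
```

A sharply shrinking ratio relative to the pre-election baseline is the kind of signal the quoted passage points to.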

Fake news engagement has dropped by 65% in less than three years

The research team behind this study says that, even though there can't be 100% certainty, the changes Facebook made to its false news detection algorithms and policies are the most probable reason.

The results of the study show that, overall, the volume of interactions on false stories shared from misinformation websites dropped drastically, from 200 million monthly shares, comments, and likes at the end of 2016 to just 70 million per month in July 2018.
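The 65% figure follows directly from those two data points, as a quick check confirms:

```python
# Monthly Facebook interactions (shares, comments, likes) on stories from
# flagged misinformation sites, per the figures reported in the study.
peak_late_2016 = 200_000_000  # ~200 million per month at the end of 2016
july_2018 = 70_000_000        # ~70 million per month by July 2018

drop = (peak_late_2016 - july_2018) / peak_late_2016
print(f"Engagement drop: {drop:.0%}")  # → Engagement drop: 65%
```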

Although it may not look encouraging that fake news still manages to attract tens of millions of people, we should take into account that Facebook lowered engagement with false stories by 65% in less than three years.

On the other hand, even though Facebook's anti-misinformation campaign looks highly successful, the company still has to do a lot more work before fake news engagement drops low enough that the websites spreading false stories on the platform no longer consider it worthwhile.

Facebook has just announced that, beginning with September 14th, 2018, its fact-checking effort has expanded to videos and photos, with optical character recognition (OCR) and machine learning among the extra false news detection tools added to vet multimedia content on the social network.

Photo Gallery (2 Images)

Fake news
Facebook vs Twitter engagement trends comparison