Facebook Fact-Checkers Spot Less Than 1% of Coronavirus-Related Fake News, Oxford Study Finds

9/22/20 • Tech Round — By Shannon Rawlins

YouTube videos with false information gather more shares on social media than the videos of five leading news broadcasters combined.


Facebook is the most popular platform for sharing YouTube misinformation videos, ahead of Twitter and Reddit.

Analysis shows failure of Facebook’s content moderation policies, with third party fact-checks catching less than 1% of COVID-19 misinformation videos.

A new study by the Oxford Internet Institute and the Reuters Institute for the Study of Journalism reveals that coronavirus-related misinformation videos spread primarily through social media, and that Facebook is the main channel for sharing them because too few fact-checks are in place to moderate the content.

The Oxford study examined over a million YouTube videos about COVID-19 which circulated on social media and identified videos which had eventually been removed by YouTube because they contained false information.

However, the study found that these misinformation videos do not find their audience through YouTube itself, but largely through being shared on Facebook. Data analysis revealed that YouTube videos containing coronavirus-related misinformation were shared nearly 20 million times on Facebook between October 2019 and June 2020, giving them a greater reach on social media than the five largest English-language news sources on YouTube (CNN, ABC News, BBC, Fox News and Al Jazeera) combined, whose videos were shared 15 million times.

Misinformation videos also drew more reactions on Facebook than on other social media platforms. Videos shared on Facebook generated a total of around 11,000 reactions (likes, comments or shares) before being deleted by YouTube. In comparison, videos posted on Twitter were retweeted around 63 times on average.

The Oxford researchers also found that out of the 8,105 misinformation videos shared on Facebook between October 2019 and June 2020, only 55 had warning labels attached by the company’s third-party fact-checkers, less than 1% of all misinformation videos. This failure of fact-checking helped COVID-related misinformation videos spread on Facebook and find a large audience.

Just 250 Facebook groups are responsible for half of the visibility that misinformation videos acquire among public social media accounts. The most popular misinformation videos often include individuals who claim medical expertise making unscientific claims about treatments for and protection from the coronavirus disease.

Dr Aleksi Knuutila, Postdoctoral Researcher at the Oxford Internet Institute, said:

“People searching for Covid-related information on YouTube will see credible sources, because the company has improved its algorithm. The problem, however, is that misinformation videos will spread by going viral on other platforms, above all Facebook. The study shows that misinformation videos posted on YouTube found a massive audience, and it is likely to have changed people’s attitudes and behaviour for the worse. For the sake of public health, platform companies need to ensure the information people receive is accurate and trustworthy.”
