As misinformation flourished online during the COVID-19 pandemic, a number of platforms announced policies and practices aimed at combating the spread of misinformation. Did those efforts work?
Research published in Science Advances suggests that the COVID-19 vaccine misinformation policies of Facebook, the world's largest social media platform, were not effective in combating misinformation. The study is titled "The Efficacy of Facebook's Vaccine Misinformation Policies and Architecture During the COVID-19 Pandemic." Researchers at Johns Hopkins University contributed to the report.
The study, led by researchers at the George Washington University, found that Facebook's efforts were undermined by the core design features of the platform itself.
"There is significant attention given to social media platforms and artificial intelligence governance today. However, this discussion largely focuses on either content or algorithms. To effectively tackle misinformation and other online harms, we need to move beyond content and algorithms to also focus on design and architecture," says David Broniatowski, lead study author and an associate professor of engineering management and systems engineering at GW.
"Our results show that removing content or changing algorithms can be ineffective if it doesn't change what the platform is designed to do: enabling community members to connect over common interests (in this case, vaccine hesitancy) and find information that they are motivated to seek out."
Facebook is designed to build communities around the things people care about. To do so, it uses several different architectural features, including fan pages that promote brands and community celebrities, enabling a relatively small group of influencers to reach large audiences.
These influencers can then form groups that are explicitly designed to build communities where members can exchange information, including how to access misinformation or other compelling content off the platform.
These group members, and especially group administrators (who are often page content creators), can then make use of Facebook's news feed algorithms to ensure that this information is available to those who care to see it.
The researchers found that, while Facebook expended significant effort to remove a lot of anti-vaccine content during the COVID-19 pandemic, overall engagement with anti-vaccine content did not decrease beyond prior trends and, in some cases, even increased.
In the content that was not removed, there was an increase in links to off-platform, low-credibility sites and links to misinformation on "alternative" social media platforms.
In addition, the anti-vaccine content remaining on Facebook became more, not less, misinformative, containing sensationalist false claims about vaccine side effects that were often too new to be fact-checked in real time.
Furthermore, anti-vaccine content producers used the platform more effectively than pro-vaccine content producers.
Even when Facebook tweaked its algorithms and removed content and accounts to combat vaccine misinformation, the researchers say, the architecture of the platform pushed back.