SEP 16, 2023 3:00 PM PDT

Facebook Policies Didn't Control the Spread of COVID-19 Misinformation. Here's Why

WRITTEN BY: Ryan Vingum

To date, the COVID-19 pandemic has been responsible for the deaths of nearly 7 million people across the globe. Amid this devastating pandemic, one behavior that proved particularly troublesome was the spread of misinformation, both deliberate and unwitting. This included misinformation about the virus itself and its origins and, once they became available, about the vaccines.

Social media platforms, in particular, came under intense scrutiny for their role in spreading misinformation. Indeed, social media platforms have a unique ability to circulate content, largely because of the algorithms they use to shape what users see (or don't see). In many cases, these algorithms were blamed for contributing to the spread of misinformation. But according to a new study conducted by researchers at George Washington University, the design and architecture of a social media platform may actually be the reason that misinformation is able to spread, particularly on sites like Facebook. The team's findings are published in a recent article in Science Advances.

Specifically, the researchers found that modifying algorithms or banning and removing content altogether wasn't entirely effective because of how Facebook was designed: to promote community and connection with others. Because of this underlying design, like-minded people can form communities and networks that persist despite changes meant to prevent the spread of certain types of content. In the case of COVID-19 misinformation, sharing of this type of content even increased, despite Facebook's efforts to curb it. Facebook's entire architecture, in fact, is built around this kind of community building, including fan page features that enable even small groups of people to wield significant influence in pushing certain types of content.

These community features created a number of challenges. Content-moderation algorithms, for example, sometimes removed content that would be deemed "pro-vaccine." Researchers also found that people who pushed misinformation made more effective and strategic use of Facebook's architecture, doing a better job of delivering their content across a range of pages and groups.

Sources: EurekAlert!; WHO; Science Advances

About the Author
Science writer and editor, with a focus on simplifying complex information about health, medicine, technology, and clinical drug development for a general audience.