
Social Media Giant's Efforts to Stymie Antivax Content Had Mixed Results

— Facebook aimed to tackle misinformation, but did it work?

[Photo: the Facebook logo seen through a droplet hanging off the tip of a syringe.]

While Facebook's policy on removing repeat offenders who post COVID-19 vaccine misinformation may have reduced the number of posts in anti-vaccine pages/groups, it has not led to a sustained reduction in engagement with anti-vaccine content, a study suggested.

Explicitly anti-vaccine pages and groups were 2.13 times (95% CI 1.70-2.66) more likely to be removed than their pro-vaccine counterparts, and anti-vaccine post volumes decreased 1.47 times more than pro-vaccine post volumes (RR 0.68, 95% CI 0.61-0.76), reported David A. Broniatowski, PhD, of George Washington University in Washington, D.C., and colleagues in Science Advances.

Posts to anti-vaccine groups also decreased relative to pre-policy trends, and anti-vaccine group post volumes decreased 3.57 times more than pro-vaccine group post volumes (RR 0.28, 95% CI 0.21-0.37).
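The "X times more" figures above are simply the reciprocals of the reported rate ratios (a RR of 0.68 means anti-vaccine volumes fell to 0.68 times the pro-vaccine level, i.e. 1/0.68 ≈ 1.47 times the decrease). A minimal sketch of that conversion, with the study's two RRs as inputs:

```python
def fold_decrease(rate_ratio: float) -> float:
    """Convert a rate ratio (RR) into the 'X times more' decrease figure.

    A RR below 1 means the anti-vaccine volume fell relative to the
    pro-vaccine volume; the reciprocal expresses that as a fold change.
    """
    return round(1 / rate_ratio, 2)

print(fold_decrease(0.68))  # pages: 1.47
print(fold_decrease(0.28))  # groups: 3.57
```

The reported 95% confidence intervals apply to the rate ratios themselves, not to these derived fold-change figures.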

The authors pointed out that pro-vaccine content was also removed, "and anti-vaccine content became more misinformative, more politically polarized, and more likely to be seen in users' newsfeeds."

Co-author Lorien C. Abroms, ScD, MA, also of George Washington University, told Ƶ that platforms like Facebook enacting policies on vaccine misinformation "could potentially have a big public health impact."

Abroms noted that when Facebook announced this policy in late 2020, "we thought this was a wonderful opportunity to evaluate: could they rein in their misinformation as they intended to do?"

She said that they "saw that while the sheer number of posts went down, engagement with misinformation not only rebounded but went to a level that was higher than its engagement level prior to the policy announcement."

Post-policy, engagement with content on anti-vaccine pages did not change significantly (RR 0.73, 95% CI 0.28-1.90). Engagement with anti-vaccine Facebook groups, however, grew, averaging 33% higher than would be expected on the basis of pre-policy trends (RR 1.33, 95% CI 1.05-1.69), although this increase was not significant when compared with pro-vaccine group trends (RR 1.22, 95% CI 0.94-1.56).

Of the misinformative posts, the largest increase was in those alleging severe adverse reactions to the COVID-19 vaccine (OR 1.41, 95% CI 1.05-1.90). Reports of hospitalizations and deaths (OR 1.23, 95% CI 1.09-1.38) also increased, as did posts promoting alternative medicine (OR 1.32, 95% CI 1.28-1.35).

Posts about alleged negative effects of vaccines on immunity, either due to toxic ingredients (OR 1.24, 95% CI 1.13-1.37), or focused on children (OR 1.34, 95% CI 1.02-1.77), also increased.

There were also increases in posts about schools (OR 2.02, 95% CI 1.65-2.46), other vaccine mandates (OR 1.21, 95% CI 1.16-1.27), legislation opposing vaccination (OR 1.06, 95% CI 1.02-1.13), and anti-vaccine medical advice (OR 1.14, 95% CI 1.09-1.18).

Ultimately, Broniatowski and team concluded that "since engagement levels with anti-vaccine page content did not differ significantly from pre-policy trends, this potentially reflects vaccine-hesitant users' desire for more information regarding a novel vaccine at a time when specific false claims had not yet been explicitly debunked."

Regina Royan, MD, of the University of Michigan in Ann Arbor, told Ƶ that this study's findings illuminate "just how much work still needs to be done, and the increasingly polarized climate in which physicians and public health scientists are trying to work."

Because the study only looked at posts in English, Royan noted that the research wasn't able to capture changing vaccine messaging among other groups.

"There is some evidence that policies and tools designed to identify misinformation are even worse at identifying misinformation in Spanish," she said, adding that future research should analyze the effectiveness of similar policies in posts from other languages.

"Misinformation kills, and there will be more pandemics in our lifetime," Royan noted. "Figuring out how to accurately identify misinformation and prevent its spread is imperative to prevent loss of life during this pandemic and the next."

For this study, Broniatowski and colleagues used CrowdTangle to pull data from public Facebook pages and groups. In December 2020, the social media giant announced it would remove accounts and pages that repeatedly post misinformation about COVID-19 vaccines, which was later extended to vaccine misinformation in general.

The researchers said that they analyzed both pages and groups because they serve different functions: "Only page administrators may post in pages, which are designed for marketing and brand promotion. In contrast, any member may post in groups, which serve as a forum for members to build community and discuss shared interests," they wrote.

The pre-policy search came up with 216 vaccine-related pages (114 anti-vaccine and 102 pro-vaccine) as well as 100 groups (92 anti-vaccine and 8 pro-vaccine); within those were 119,091 posts to pages (73% anti-vaccine) and 168,419 posts to groups (97% anti-vaccine) that were created from Nov. 15, 2019 to Nov. 15, 2020.

There were 177,615 posts to those same pages (62% anti-vaccine) and 244,981 posts to those same groups (95% anti-vaccine) that were created from Nov. 16, 2020 to Feb. 28, 2022. Five percent of public anti-vaccine groups switched to private during the study period, making their posts inaccessible to CrowdTangle.

The authors noted several limitations to their study, including that they could only analyze public Facebook pages and groups, so their results do not apply to private ones. Additionally, it's possible that CrowdTangle missed some data, particularly from pro-vaccine spaces. Lastly, the tool cannot distinguish between posts made by individual users and those made by page administrators.


Rachael Robertson is a writer on the Ƶ enterprise and investigative team, also covering OB/GYN news. Her print, data, and audio stories have appeared in Everyday Health, Gizmodo, the Bronx Times, and multiple podcasts.

Disclosures

The study was funded in part by the John S. and James L. Knight Foundation through the GW Institute for Data, Democracy, and Politics and grants from the National Science Foundation.

Broniatowski reported receiving consulting fees from Merck & Co. and a speaking honorarium from the United Nations Shot@Life Foundation.

Abroms and other co-authors had no competing interests to disclose.

Royan also had no conflicts of interest.

Primary Source

Science Advances

Broniatowski DA, et al "The efficacy of Facebook's vaccine misinformation policies and architecture during the COVID-19 pandemic" Sci Adv 2023; DOI: 10.1126/sciadv.adh2132.