WASHINGTON -- In March, claims about the dangers of coronavirus vaccines were spreading across social media, undermining efforts to stop the virus's spread. Some Facebook employees believed they had found a solution.
Researchers at the company found that by subtly changing how vaccine posts were ranked in people's news feeds, they could reduce the misleading information users saw about COVID-19 vaccines and instead surface legitimate sources such as the World Health Organization.
"Given these findings, I'm assuming that we're hoping for launch ASAP," a Facebook employee replied in March to an internal memo regarding the study.
Instead, Facebook shelved some of the study's suggestions. Other changes were not made until April.
In March, another Facebook researcher suggested disabling comments on vaccine posts until the platform could better tackle anti-vaccine messages lurking in them. That suggestion was rejected at the time.
Critics claim that Facebook took too long to react because it was worried about the company's profit margins.
"Why would comment removal be a problem? Engagement is what matters most," said Imran Ahmed, CEO of the Center for Countering Digital Hate. "It drives attention, and attention equals eyeballs, and eyeballs equal ad revenue."
Facebook stated in an email that it had made "considerable progress this year" with the downgrading of vaccine misinformation in its users' feeds.
Facebook's internal discussions were revealed in disclosures made to the Securities and Exchange Commission and provided to Congress in redacted form by the legal counsel of former Facebook employee-turned-whistleblower Frances Haugen. The redacted versions received by Congress were obtained by a consortium of news organizations, including The Associated Press.
The trove of documents shows that in the midst of the COVID-19 pandemic, Facebook carefully investigated how its platforms spread misinformation about life-saving vaccines. The documents also show that rank-and-file employees regularly suggested solutions for countering anti-vaccine misinformation, often to no avail. The Wall Street Journal reported last month on some of Facebook's efforts to deal with anti-vaccine comments.
That inaction raises questions about whether Facebook prioritized controversy and division over the health of its users.
"These people are selling fear," said Roger McNamee, a Silicon Valley venture capitalist who was an early investor in Facebook and is now a vocal critic. "It is not an accident. It's a business model."
Facebook ranks posts by engagement -- the number of comments, likes and reshares they generate. That ranking system may work well for innocuous topics like dog photos, recipes and the latest viral singalong. But Facebook's own documents show that when it comes to contentious, divisive issues such as vaccines, engagement-based ranking amplifies polarization, disagreement and doubt.
To study how to reduce vaccine misinformation, Facebook researchers modified the ranking system for more than 6,000 users in the U.S. and Mexico. Instead of seeing vaccine posts selected on the basis of engagement, these users saw posts chosen for their trustworthiness.
The results were striking: a nearly 12% decrease in content making claims debunked by fact-checkers, and an 8% increase in content from authoritative public health organizations such as the WHO or the U.S. Centers for Disease Control and Prevention.
Internal exchanges show that employees at the company responded with exuberance.
Facebook says it did implement many of the study's findings, but not for another month -- a delay that came at a pivotal stage of the global vaccine rollout.
Dani Lever, a company spokesperson, said the internal documents "don't reflect the significant progress made since then in promoting reliable information on COVID-19 and expanding policies to eliminate more harmful COVID and vaccine misinformation."
According to the company, it also took some time to review and implement the changes.
But the urgency to act was clearer than ever: At the time, states across the U.S. were offering vaccines to the most vulnerable, the elderly and the sick. Public health officials were worried. Only 10% of the population had received a dose of a COVID-19 vaccine, and, according to a poll by The Associated Press-NORC Center for Public Affairs Research, a third of Americans were leaning toward skipping the shot altogether.
Facebook employees admitted they didn't know how widespread anti-vaccine sentiment was in the comments sections of vaccine posts. But company research in February found that as much as 60% of comments on vaccine posts were anti-vaccine or vaccine-hesitant.
Worse, company employees admitted they had no system for catching such comments and no policy for taking them down.
"Our ability to detect (vaccine hesitancy) in comments is bad in English -- and basically non-existent elsewhere," another internal memo, posted March 2, said.
Derek Beres, an author based in Los Angeles, promotes immunizations on his account on Instagram, which is owned by Facebook. Beres began co-hosting a podcast last year after seeing conspiracy theories about COVID-19 and vaccines swirling on the social media accounts of health and wellness influencers.
When Beres posted a photo of himself getting the COVID-19 shot earlier this year, some on social media told him he would likely be dead within six months.
"The comments section has been a disaster for so many people," Beres said.
Some Facebook employees suggested disabling all comments on vaccine posts while the company searched for a solution.
One Facebook employee wrote that he was interested in the proposal "to remove ALL in-line comments for vaccine posts" as a stopgap solution until the platform could detect vaccine hesitancy in comments well enough to refine its removals.
According to Lever, the company did not stop showing previews of popular comments on vaccine posts until mid-April.
On March 15, Facebook CEO Mark Zuckerberg announced that the company would begin labeling posts about vaccines that describe them as safe.
Ahmed, of the Center for Countering Digital Hate, said the move allowed Facebook to keep profiting from the engagement that anti-vaccine comments generate.
Facebook, Ahmed said, has made decisions that have resulted in people being misinformed, and that misinformation has resulted in deaths. "At this point, a murder investigation should be initiated."