Facebook announced its latest effort to clamp down on the spread of misinformation about the coronavirus pandemic, saying it will warn users if they have “liked, reacted or commented” on content that the tech giant has deemed “harmful” and removed.
“These messages will connect people to COVID-19 myths debunked by the [World Health Organization] including ones we’ve removed from our platform for leading to imminent physical harm,” Guy Rosen, Facebook’s vice president of integrity, wrote in a blog post. “We want to connect people who may have interacted with harmful misinformation about the virus with the truth from authoritative sources in case they see or hear these claims again off of Facebook.”
Rosen added that Facebook has already “directed” more than 2 billion people to resources from the WHO and other health authorities through its COVID-19 Information Center and pop-ups on Facebook and Instagram. He said the features will begin rolling out over the next several weeks.
This April 25, 2019, photo shows the thumbs up Like logo on a sign at Facebook headquarters in Menlo Park, Calif. (AP Photo/Jeff Chiu)
Rosen explained that once a piece of content is rated “false” by fact-checkers, its distribution is reduced and warning labels are attached to it. He said that in March, warnings were placed on 40 million COVID-19-related posts, based on 4,000 articles reviewed by the company’s independent fact-checkers.
“When people saw those warning labels, 95 [percent] of the time they did not go on to view the original content,” Rosen continued. “To date, we’ve also removed hundreds of thousands of pieces of misinformation that could lead to imminent physical harm. Examples of misinformation we’ve removed include harmful claims like drinking bleach cures the virus and theories like physical distancing is ineffective in preventing the disease from spreading.”
The move is just the latest by the Mark Zuckerberg-led company as it reconfigures its platform in response to the pandemic.
In March, Facebook, Instagram and Twitter deleted social media posts from Brazilian President Jair Bolsonaro after the platforms deemed they were spreading misinformation regarding COVID-19.
Separately in March, Facebook shared tips with its more than 2 billion users on how to spot fake news.
Earlier this month, Facebook said it would start to ask some of its U.S.-based users about their health in an effort to give researchers more information about self-reported COVID-19 patients.
As of Thursday morning, more than 2 million coronavirus cases had been diagnosed worldwide, more than 639,000 of them in the U.S., the hardest-hit country on the planet.