Facebook announced Thursday its first policy to combat misinformation about vaccines, following in the footsteps of Pinterest and YouTube.

The social network is adopting an approach similar to the one it uses to tackle fake news: The company will not remove incorrect content, but it will aim to reduce the reach of that content by making it harder to find.

“Leading global health organizations, such as the World Health Organization and the U.S. Centers for Disease Control and Prevention, have publicly identified verifiable vaccine hoaxes,” Monika Bickert, Facebook’s vice president for global policy management, said in a statement Thursday. “If these vaccine hoaxes appear on Facebook, we will take action against them.”

Anti-vaccination groups have flourished on Facebook, partly because of the site’s search results and “suggested groups” feature.

The group Stop Mandatory Vaccination, for example, has nearly 159,000 members. Larry Cook, the founder, bragged on a recently deactivated GoFundMe page about an anti-vaccine Facebook video that he said was averaging over 100,000 views per day. “This is how we reach parents!” he wrote. The video has 50,000 shares and has been viewed 2.1 million times.

Under Facebook’s new policy, groups and pages that spread misinformation about vaccines will have lower rankings and won’t be included in recommendations or predictions when users are searching within Facebook, the company said.

Instagram, which is owned by Facebook, will have similar policies. “We won’t show or recommend content that contains misinformation about vaccinations on Instagram Explore or hashtag pages,” Ms. Bickert said in the statement.

Facebook’s new rules come amid measles outbreaks in the United States and abroad, and just days after yet another study demonstrated that the measles vaccine doesn’t cause autism. The idea that vaccines are somehow linked to autism has been widely debunked, but still persists among anti-vaccination activists.

Last month, Representative Adam Schiff, Democrat of California and the chairman of the House Intelligence Committee, wrote a letter to Mark Zuckerberg, the chief executive of Facebook, asking what steps the company was taking to prevent anti-vaccine information from being recommended to users.

Facebook said its artificial intelligence system will search for vaccine misinformation and flag posts and links — including pictures and videos that appear in closed groups — that will then be reviewed by someone at the company.


[Photo: The entrance of Facebook’s corporate headquarters in Menlo Park, Calif. Credit: Josh Edelson/Agence France-Presse — Getty Images]

If the content is found to contain false claims about vaccines, then posts from the violating groups or pages will appear lower in a user’s news feed, the company said. But members of Facebook groups that promote anti-vaccination content will still see the posts on the group’s page. The company said it is working on ways to warn new or existing group members if a group has shared vaccine misinformation.

Anti-vaccine groups will become “craftier” as moderation techniques develop, said Joan Donovan, the director of the Technology and Social Change Research Project at the Shorenstein Center at Harvard Kennedy School.

Anti-vaccine propaganda sometimes surfaces on old, abandoned Facebook accounts. On Thursday afternoon, for example, the Facebook page Occupy Philly showed two recent posts warning about the dangers of vaccination.

Anti-vaccination groups can also harness search-engine optimization “by using very specific key words, especially the prescription names of some of these vaccines,” Dr. Donovan said, adding that anti-vaccination groups will also spread out into “momversation groups,” where parents gather online.

Screen shots and other images containing written messages can also help posters hide from tech-based moderation, said Dr. Donovan, who researches disinformation and media manipulation.

While Facebook said its artificial intelligence system can decipher text that has been added to photos, the company said it would not target every post about vaccines, focusing instead on specific claims about vaccines that have been disproved.

The company is also aiming to crack down on advertising that includes misinformation about vaccinations. Such ads will be rejected, Facebook said, and the company may disable ad accounts that violate its policies. It has also removed certain vaccine-related targeting options like “vaccine controversies.”

“I’m really pleased that they are recognizing the downstream impact of this kind of misinformation and taking the right steps to balance expression with the recognition that their curation and their suggestions do have an impact on the communities that people join,” said Renée DiResta, the co-founder of Vaccinate California and the director of research at a cybersecurity company. “I think that the decision to stop accepting ad dollars is the right call.”

The World Health Organization identified “vaccine hesitancy” as one of this year’s 10 notable threats to global health. The decision to avoid vaccination can stem from many things: worries about side effects, cost, moral or religious objections, or a lack of knowledge about immunizations.

As anti-vaccine groups have infiltrated social media, companies have been pressured to stem the tide of misinformation.

Pinterest blocked results associated with certain vaccine-related searches last year, and last month it said it was working with experts to develop a more tailored long-term approach.

In late 2017, YouTube began surfacing more authoritative content for people searching for vaccination-related topics, and its policies prohibit anti-vaccine videos from showing ads. In India, the company has rolled out information panels that fact-check specific claims as another way of combating misinformation, YouTube said on Thursday. The company said the fact-check panels would expand to other countries this year.