Facebook and YouTube are suppressing anti-vaccination content by making the divisive material difficult to locate.

Facebook plans to prevent pages and groups that spread misinformation about vaccinations from appearing in people’s Facebook News Feeds and in search results.

In addition, the social network is no longer running ads that target people interested in “vaccine controversies,” Facebook Vice-President Monika Bickert wrote in a company post.

She indicated that when ads containing misinformation about vaccinations are found, they will be rejected.

“For ad accounts that continue to violate our policies, we may take further action, such as disabling the ad account,” she wrote.

The crackdown will also affect Instagram, which is owned by Facebook. Like Facebook, YouTube has also pulled advertising from videos that feature anti-vaccination conspiracy theories, a move the video-sharing site initially publicized last month.

The crackdown is likely to anger free speech advocates, as Facebook and YouTube have long had to balance fighting misinformation against charges of censorship.

However, the crackdown is not a complete ban on the controversial content. Facebook, for instance, made no mention of deleting pages or groups that carry anti-vaccination content.

Presumably, if users search hard enough, they’ll still be able to find posts by supporters of the anti-vaxxer movement.

Nevertheless, both Facebook and YouTube plan to discourage people from buying into conspiracies about vaccinations by pairing anti-vaxxer content with educational information from authoritative medical sources.