TikTok and Bumble Join Facebook in Fight to Stop Spread of Leaked Nudes, ‘Revenge Porn’

When Facebook asked Australians in 2017 to submit their nude photos to the platform to help it develop a tool to combat revenge porn, the announcement drew a lot of snarky media commentary.

The idea was to help those who had shared intimate photos or videos with someone else, only to have that person threaten to share them online. Participants in the trial would let the social media company convert the images into a digital fingerprint called a hash. Facebook would then use hash-matching technology to search for and block any attempts to upload the original images to the platform.
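For readers curious how that hash-matching step works in practice, here is a minimal sketch of the idea. It is not Facebook's actual implementation: the article doesn't name the hashing algorithm, so the example uses the open-source imagehash library's perceptual hash as a stand-in, and the file names and threshold are placeholders.

```python
# Illustrative only: a perceptual hash as a stand-in for whatever fingerprinting
# the platform actually uses. Requires: pip install pillow imagehash
import imagehash
from PIL import Image

# The "bank" of fingerprints built from images victims have reported.
blocked_hashes = [imagehash.phash(Image.open("reported_photo.jpg"))]

def should_block(upload_path: str, max_distance: int = 4) -> bool:
    """Return True if an uploaded image matches a fingerprint in the bank.

    Perceptual hashes of near-identical images differ by only a few bits,
    so the check uses a small Hamming-distance threshold rather than equality.
    """
    upload_hash = imagehash.phash(Image.open(upload_path))
    return any(upload_hash - known <= max_distance for known in blocked_hashes)
```

The key property is that the fingerprint can flag a re-upload without the platform needing to keep or view the original image.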

When I wrote about it at the time, I was struck by the number of women — the overwhelming share of victims in this type of situation — who contacted me, desperate to know if and when Facebook would launch this pilot feature in their countries. The process certainly wasn’t perfect, as it required vulnerable people to send their photos directly to strangers: human content moderators. However, it allowed victims of a horrible form of harassment and sometimes extortion to claw back a tiny bit of control in an otherwise desperate situation.

Fast-forward to Dec. 1, 2022. Meta Platforms Inc.’s Facebook and Instagram have been running a refined version of the Australian pilot globally for a year. The tech giant developed the tool in partnership with SWGfL, the UK-based nonprofit behind the Revenge Porn Helpline. It’s found at StopNCII.org (Stop Non-Consensual Intimate Image Abuse). 

The effort has helped more than 12,000 people hash more than 40,000 photos and videos. Those hashes are shared with Meta, which then blocks any attempt to upload the originals across its social networks.

Now TikTok, the world’s fastest-growing social network with more than a billion mostly young users, and the dating app Bumble Inc. are getting involved. As of today, they will also block any images included in StopNCII.org’s bank of hashes.

Motivated harassers will often try to upload an ex’s intimate images to several different platforms in an effort to humiliate, manipulate or extort them. And the most damaging places can be those where a victim’s friends, co-workers, potential partners and family members are already connected. That’s why it’s about time TikTok joined the project Meta launched.

The move is a way for TikTok, the social media platform du jour for Gen Z, to show it’s taking steps to deal with online harms at a time when it’s under increasing regulatory scrutiny. Its approach to age verification in particular has come under fire in a Bloomberg Businessweek report on children under age 13 who died after attempting a choking game that went viral on the platform.

And the move comes just as the platform, owned by Beijing-based ByteDance Ltd., must contend with changes to British tech law. The UK is preparing to introduce new rules under the Online Safety Bill to require platforms that host user-generated content to prioritize the removal of revenge porn or face stiff penalties. This echoes similar legislation in Australia, where social media sites are required to take down reported material within 24 hours.

While many countries have outlawed the sharing of intimate images without consent, the approach the UK and Australia are taking creates additional liability for the social media platforms themselves, which strengthens the business case for removing this material. Other regions, including the European Union and the US, are watching closely.

The system that StopNCII.org and Meta came up with evolved from the anxiety-provoking practice of sending Facebook one’s nudes. Instead, it provides a central place, run by independent intimate image abuse experts, where victims or potential victims can convert their photos or videos into hashes within their own browser, so the originals don’t leave their device. Only the hashed versions are shared with industry partners. 
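As a rough illustration of that client-side step, the sketch below computes the fingerprint on the user's own machine and transmits only that string. It is a guess at the workflow, not the service's code: the real tool runs in the browser, the endpoint URL here is a placeholder, and the perceptual hash again stands in for whatever algorithm StopNCII.org actually uses.

```python
# Illustrative sketch: hash locally, share only the hash.
# Requires: pip install pillow imagehash requests
import imagehash
import requests
from PIL import Image

def fingerprint_locally(path: str) -> str:
    """Compute the hash on the user's device; the image itself never leaves it."""
    return str(imagehash.phash(Image.open(path)))

def submit_hash(digest: str) -> None:
    """Share only the fingerprint with a case-creation service (placeholder URL)."""
    requests.post("https://example.org/api/cases", json={"hash": digest}, timeout=10)

if __name__ == "__main__":
    submit_hash(fingerprint_locally("private_photo.jpg"))
```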

It’s still not enough.

“We now have four platforms, but we need thousands,” said SWGfL’s chief executive officer, David Wright. “The more we can get ingesting the hashes, the more we can reduce the threat and fear victims experience.”

StopNCII.org is only effective on platforms that want to crack down on this problem. It can’t help with sites dedicated to sharing nonconsensual images, which in some cases let disgruntled ex-partners create elaborate, search engine-optimized profiles that include the victim’s name and personal information. Google has a separate tool that lets victims request that links to these profiles be removed from search results. But victims and advocacy groups would prefer a more coordinated approach between the major internet companies, so that tackling revenge porn wouldn’t feel like a game of whack-a-mole.

Source: Bloomberg