Meta is suing a Chinese app maker that uses artificial intelligence to take images of clothed people and turn them into nudes.
“CrushAI,” the app used to make the deepfake nudes, is operated by Joy Timeline HK Limited. Meta filed a lawsuit against the company in Hong Kong to ban it from advertising its services on Meta platforms, CBS News reports.
“This legal action underscores both the seriousness with which we take this abuse and our commitment to doing all we can to protect our community from it,” Meta said in a statement. “We’ll continue to take necessary steps — which could include legal action — against those who abuse our platforms like this.”
According to the lawsuit, Joy Timeline made “multiple attempts” to circumvent Meta’s ad review process.
Joy Timeline’s app isn’t the first of its kind. Previous apps promising to turn clothed photos into nudes have managed to bypass ad filters on major social media platforms, including Meta’s, in order to hawk their software.
Meta said that “nudify” apps have devised various ways of skirting its ad filters, including using inoffensive imagery to fly under the radar.
“We’ve worked with external experts and our own specialist teams to expand the list of safety-related terms, phrases and emojis that our systems are trained to detect with these ads,” Meta said in a statement.
Alexios Mantzarlis, the author of the Faked Up blog, told the BBC there had been “at least 10,000 ads” promoting nudify apps on Meta’s Facebook and Instagram platforms.
“Even as [Meta] was making this announcement, I was able to find a dozen ads by CrushAI live on the platform and a hundred more from other ‘nudifiers’,” he told the broadcaster. “This abuse vector requires continued monitoring from researchers and the media to keep platforms accountable and curtail the reach of these noxious tools.”
The threat posed by the software is that anyone could take a photo and, without the subject’s consent, turn it into a fake nude.
Meta said that it bans “non-consensual intimate imagery” on its platforms, and previously told CBS News that it removes any ads on its platforms for “nudify” apps.
On Thursday, Meta said it would work with the Tech Coalition’s Lantern Program — aimed at tracking sites that break child safety rules — to share information with other tech companies about apps, sites, or companies that violate its policies.