What just happened? Meta's mission to crack down on AI "nudify" programs has led to it suing the maker of one of these apps. The social media giant has launched a lawsuit against Joy Timeline HK Limited, which developed an app called Crush AI.
In the suit, filed in Hong Kong, where Joy Timeline is based, Meta states that Crush AI made multiple attempts to circumvent Meta's ad review process and continued placing ads even after they were removed.
The ads appeared across Facebook and Instagram, and while Meta repeatedly removed them for breaking the rules, Joy Timeline kept posting more.
One of the many negative consequences of the advancement of generative AI has been the rise of nudify apps, which use the technology to generate nonconsensual nude and explicit images of real people from ordinary photos of them.
Crush AI had been one of the most prolific advertisers among these apps. A January investigation by Alexios Mantzarlis, author of the Faked Up newsletter, found that Meta's platforms ran more than 8,000 Crush AI-related ads during the first two weeks of the year alone. He also noted that roughly 90% of Crush AI's website traffic came from either Facebook or Instagram.
Crush AI evaded Meta's review process by setting up dozens of advertiser accounts and frequently changing domain names. It also ran a Facebook page promoting its service.
Senator Dick Durbin sent a letter to Meta CEO Mark Zuckerberg in February, urging him to address the ads. Durbin wrote that the ads violated Meta's Advertising Standards, including its prohibitions on ads featuring adult nudity, sexual activity, and certain forms of bullying and harassment.
"90% of online traffic to a nudify app originates from Meta advertising. It's incredibly problematic and creates new victims of deepfake intimate imagery," Durbin said in a public statement. "I just told Mark Zuckerberg: it's time to step up."
Meta says that it has now developed new technology that is designed to find and remove these types of nudify ads more quickly. It has also expanded the list of terms, phrases and emoji that are flagged by its systems.
The company is also working with specialist teams to stay up to date with how these app makers evolve their tactics to avoid detection, and it will share signals about the apps with other tech companies so they can take action on their own platforms.
In May last year, Google announced a policy prohibiting ads for deepfake porn and for services that promise to digitally undress people without consent. Soon after, the San Francisco City Attorney's office sued 16 of the most-visited "undressing" sites with the aim of shutting them down.