Meta has escalated its campaign against so-called “nudify” apps, filing a lawsuit in Hong Kong against Joy Timeline HK Limited, the company behind the AI-powered CrushAI app. The app enables users to generate fake, sexually explicit images of individuals without consent, an activity that violates Meta’s longstanding policies against non-consensual intimate imagery.
Filed on June 12, the lawsuit follows repeated attempts by Joy Timeline to circumvent Meta’s ad review system after its ads were removed from Facebook and Instagram. According to Meta, the legal action seeks to block the company from advertising these apps across its platforms.
Beyond legal avenues, Meta is expanding technical and collaborative enforcement strategies.
New detection strategies
The company has developed new detection technology capable of identifying nudify-related ads even when they contain no explicit imagery, using improved matching techniques and broader recognition of safety-related terms.
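Meta has not disclosed how this detection works internally. The short sketch below is purely illustrative of the general idea of matching obfuscated keywords in ad text: the seed terms, leetspeak substitutions, and normalization rules are assumptions for the example, not Meta's actual system.

```python
import re
import unicodedata

# Hypothetical seed terms; the real term lists are not public.
SEED_TERMS = ["nudify", "undress", "remove clothes"]

# Undo common character substitutions advertisers use to dodge keyword filters.
LEET_MAP = str.maketrans({"0": "o", "1": "i", "3": "e", "4": "a", "5": "s", "7": "t", "@": "a", "$": "s"})

def normalize(text: str) -> str:
    """Lowercase, strip accents, undo leetspeak, and collapse separators."""
    text = unicodedata.normalize("NFKD", text)
    text = "".join(ch for ch in text if not unicodedata.combining(ch))
    text = text.lower().translate(LEET_MAP)
    # Remove punctuation and whitespace used to split keywords, e.g. "n.u.d.i.f.y".
    return re.sub(r"[\s\.\-_*]+", "", text)

def matches_safety_terms(ad_text: str, terms=SEED_TERMS) -> list[str]:
    """Return the seed terms whose normalized form appears in the normalized ad text."""
    haystack = normalize(ad_text)
    return [t for t in terms if normalize(t) in haystack]

if __name__ == "__main__":
    print(matches_safety_terms("Try our new app to u.n.d.r.3.s.s any photo!"))  # ['undress']
```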
Since the beginning of the year, Meta has dismantled four distinct ad networks attempting to promote nudify apps through coordinated inauthentic behaviour. The company is also sharing data with other tech firms, more than 3,800 URLs so far, via the Tech Coalition's Lantern program, which facilitates cross-platform enforcement.
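The announcement does not describe the Lantern program's data formats. As a rough illustration only, cross-platform URL sharing often involves canonicalizing a URL and attaching a hash for deduplication; the parameter list and signal schema below are assumptions, not Lantern's actual interface.

```python
import hashlib
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Tracking parameters to drop before sharing; this list is assumed for the example.
TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "fbclid", "gclid"}

def canonicalize(url: str) -> str:
    """Lowercase scheme and host, drop fragments and tracking params, sort the rest."""
    parts = urlsplit(url.strip())
    query = sorted((k, v) for k, v in parse_qsl(parts.query) if k not in TRACKING_PARAMS)
    return urlunsplit((parts.scheme.lower(), parts.netloc.lower(), parts.path, urlencode(query), ""))

def url_signal(url: str) -> dict:
    """Build a shareable signal: canonical URL plus a SHA-256 digest for deduplication."""
    canon = canonicalize(url)
    return {"url": canon, "sha256": hashlib.sha256(canon.encode("utf-8")).hexdigest()}

if __name__ == "__main__":
    print(url_signal("HTTPS://Example.com/app?utm_source=ad&id=42#promo"))
```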
“These actors are financially motivated and constantly shift tactics,” Meta said. “We are evolving our response accordingly.”
A push for new laws
Meta reaffirmed its support for broader regulatory action, including the U.S. “Take It Down Act,” aimed at helping individuals remove non-consensual intimate content.
The company also endorsed parental oversight legislation that would allow guardians to monitor and restrict app downloads, including nudify apps, on their children’s devices.