Meta sues AI app behind non-consensual sexual images

The company, Joy Timeline HK Limited, is the developer of 'CrushAI,' an app that enables users to create non-consensual sexually explicit content by digitally removing clothing or fabricating nude images of individuals.

Meta has filed a lawsuit against a Hong Kong-based company behind an AI tool that generates fake nude images, in a move to combat the growing misuse of artificial intelligence for intimate image abuse. The company, Joy Timeline HK Limited, is the developer of 'CrushAI,' an app that enables users to create non-consensual sexually explicit content by digitally removing clothing or fabricating nude images of individuals, often using stolen or public photos.

Meta said the app and its promotions repeatedly violated its policies, which prohibit any form of non-consensual intimate imagery, whether real or AI-generated. Although Meta removed multiple ads and accounts linked to the app, the developers continued to bypass the platform’s review process and attempted to re-promote the service. The lawsuit, filed in Hong Kong, seeks to prevent Joy Timeline from advertising the app across Meta’s platforms, including Facebook and Instagram.

The case highlights rising concerns around the misuse of AI for sexual exploitation, especially of women and minors. A recent study from the University of Florida noted a surge in websites offering “nudification” tools, many of which feature fake content involving underage individuals. Meta’s move comes amid mounting pressure on tech platforms to implement stronger safeguards as AI tools become more accessible and potentially harmful.

To tackle the issue more broadly, Meta says it is also developing detection technology to identify misleading ads even when they do not contain explicit content. The company has shared thousands of violating URLs with other tech firms through the Tech Coalition’s Lantern program to support cross-platform enforcement. Since early 2025, Meta has dismantled four separate networks operating nudify-related ad accounts.

The company also reaffirmed support for emerging legislation like the U.S. 'Take It Down Act,' which aims to outlaw the distribution of AI-generated non-consensual imagery. Meta maintains that while technology enables new forms of creativity, it must not come at the cost of safety and dignity.

This legal step marks one of the first major crackdowns on AI-generated sexual content, as platforms face increasing scrutiny over their role in the spread of digital harm. Whether it will serve as a deterrent remains to be seen, but Meta’s action signals a firmer stance on AI misuse going forward.
