EU Targets Grok but Ignores AI Nude Apps

Published on January 27, 2026 | Translated from Spanish
[Illustration: a mobile phone displaying icons of AI apps that generate nude images, superimposed over the European Union logo, symbolizing regulatory concern.]


European Union authorities have expressed alarm at the possibility that the Grok artificial intelligence tool could generate images of nude people. This stance, however, overlooks a much broader and well-established technological reality. While discussions to restrict access to Grok's image editor are underway, numerous applications with identical functions remain freely available on the world's largest app distribution platforms. 📱

A Massive Ecosystem in Official Stores

A recent study by the Technology Transparency Project (TTP) provides figures on this phenomenon. The research identifies over a hundred apps in Google Play and Apple App Store specifically designed to digitally remove clothing from people in photos. These tools primarily focus on portraits of women, capable of showing subjects fully nude, partially undressed, or wearing very little clothing.

Key Data from the TTP Report:
  • 55 applications were detected in Google Play and 48 in Apple's App Store.
  • Together, these apps have accumulated more than 705 million downloads globally.
  • They have generated estimated revenues of around 117 million dollars.

In short, the machinery for creating non-consensual sexualized images is widely distributed and easily accessible.

The Measure Against Grok is a Limited Solution

Proposing to restrict only the Grok image editor addresses just a small part of the conflict. The existence of this extensive catalog of specialized apps shows that the problem is structural. Focusing on a single service, while an entire ecosystem operates with impunity and reaches hundreds of millions of users, calls into question the real effectiveness of the proposed response. 🚨

Challenges for Effective Regulation:
  • The response must directly involve the app stores, demanding that they strengthen their review policies.
  • A comprehensive approach is necessary, not isolated actions against individual tools.
  • There is an evident disparity in enforcement: some apps pass review filters unchallenged while others are singled out.

A Conclusion on Digital Double Standards

The situation reveals a paradox of the digital environment: it is technologically easier to undress a person in an image than to implement consistent, equitable controls to prevent it. While attention focuses on a single actor, dozens operate in plain sight, suggesting that enforcement is applying a double standard. A genuine solution requires looking beyond the immediate target and acting on the platforms that distribute these tools. ⚖️