The EU targets apps that generate sexual deepfakes

Published on May 07, 2026 | Translated from Spanish

The European Union has decided to crack down on nudity apps that use artificial intelligence to create non-consensual sexual deepfakes. The measure, part of a new regulation, aims to protect people's privacy and image against a technology that has become accessible to anyone with a smartphone. Apps that generate these fakes without the subject's explicit consent will be banned across the entire bloc.

Image: a smartphone running an AI nudity app, surrounded by EU legal warnings and a broken padlock symbolizing privacy.

How detection of these digital fakes works 🛡️

The technology behind these deepfakes typically uses generative adversarial networks (GANs) to map a person's face onto a naked body. Moderation systems, on the other hand, employ computer vision models trained on datasets of real and synthetic images to identify anomalies in textures, lighting, and edges. The new law will require platforms to implement automatic filters that block the upload of this content, with financial penalties for those who fail to comply.
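The anomaly-hunting idea behind those moderation models can be illustrated with a toy heuristic: GAN-generated or spliced regions often have smoother textures than the surrounding image, so the spread of gradient statistics across patches can hint at tampering. The sketch below is a hand-crafted stand-in, not a real moderation system (which would be a trained classifier); the function name and threshold logic are illustrative assumptions.

```python
import numpy as np

def edge_inconsistency_score(image, patch=8):
    """Toy tampering heuristic (NOT a real moderation model):
    measures how unevenly gradient-magnitude variance is spread
    across patches. Pasted or over-smoothed regions tend to
    increase this spread."""
    gy, gx = np.gradient(image.astype(float))
    mag = np.hypot(gx, gy)  # per-pixel gradient magnitude
    h, w = mag.shape
    patch_vars = []
    for i in range(0, h - patch + 1, patch):
        for j in range(0, w - patch + 1, patch):
            patch_vars.append(mag[i:i + patch, j:j + patch].var())
    patch_vars = np.array(patch_vars)
    # Coefficient of variation: high spread between patches
    # suggests regions with inconsistent texture statistics.
    return float(patch_vars.std() / (patch_vars.mean() + 1e-9))

# Synthetic demo: uniform noise vs. the same image with a
# flat, over-smoothed square pasted into the middle.
rng = np.random.default_rng(0)
natural = rng.normal(128, 20, (64, 64))
spliced = natural.copy()
spliced[16:48, 16:48] = 128.0  # artificially smooth region

assert edge_inconsistency_score(spliced) > edge_inconsistency_score(natural)
```

Production systems replace this heuristic with deep classifiers trained on paired real and synthetic images, but the underlying signal is the same: statistical inconsistencies in texture, lighting, and edges.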

The nudity app that will now only serve for boring selfies 😅

It turns out that the startups selling one-click undressing apps will now have to repurpose their code for something more useful, like an Instagram filter that puts a trench coat or a tatty raincoat on you. Developers are crying into their keyboards because their promising virtual-nudity business has gone down the drain. On the bright side, EU lawyers have at least ensured that no one will ever again receive a sexual deepfake from their brother-in-law in the family WhatsApp group.