EU Delays Its AI Act and Bans Sexual Deepfakes

Published on March 27, 2026 | Translated from Spanish

The European Parliament has voted to significantly delay the implementation of its AI Act, postponing key deadlines to 2027 and 2028. In parallel, however, it has backed an explicit ban on applications that generate fake nudes, known as nudify apps. The measure is a direct response to public alarm over sexualized deepfakes, such as those recently circulated on the X platform. The decision opens a period of regulatory uncertainty for companies, which must now wait for the final negotiation of the text with the Council of the EU.

*Image: European Parliament with flags, a clock marking 2027, and a computer screen showing a digitally manipulated face.*

3D Forensic Analysis and Computer Vision: Keys to Auditing and Compliance 🔍

The future ban on sexualized deepfakes will not be effective without robust technical tools for detection and verification. This is where 3D forensic analysis and computer vision become crucial. Manipulation-detection algorithms look for inconsistencies in lighting, cast shadows, 3D facial geometry, and skin textures that AI generators still do not replicate perfectly. To enforce the law, platforms and authorities will need automated systems that analyze metadata, search for anomalous compression artifacts, and check the physical coherence of a scene. Digital auditing will depend on this technical capability to identify synthetic content, attribute it, and protect victims.
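To make the idea concrete, here is a minimal, hedged sketch of one such check: block-wise texture-variance screening, a crude stand-in for the compression- and texture-artifact analysis described above. The 8x8 block size, the 10% threshold, and the synthetic test image are all illustrative assumptions, not a production detector.

```python
import random
import statistics

def block_variances(img, size=8):
    """Split a 2-D grayscale image (list of rows) into size x size
    blocks and return the pixel variance of each block."""
    h, w = len(img), len(img[0])
    variances = []
    for by in range(0, h - size + 1, size):
        for bx in range(0, w - size + 1, size):
            pixels = [img[y][x]
                      for y in range(by, by + size)
                      for x in range(bx, bx + size)]
            variances.append(statistics.pvariance(pixels))
    return variances

def flag_flat_blocks(img, size=8, ratio=0.1):
    """Flag blocks whose variance falls far below the image median --
    a rough proxy for the unnaturally smooth skin textures some
    generators produce. Threshold ratio is an illustrative choice."""
    v = block_variances(img, size)
    median = statistics.median(v)
    return [i for i, var in enumerate(v) if var < ratio * median]

# Build a 32x32 synthetic image: noisy (camera-like) texture everywhere...
random.seed(0)
img = [[random.randint(0, 255) for _ in range(32)] for _ in range(32)]
# ...except one 8x8 patch that is perfectly flat (suspiciously smooth).
for y in range(8, 16):
    for x in range(8, 16):
        img[y][x] = 128

suspicious = flag_flat_blocks(img)
print(suspicious)  # index of the flat, "too smooth" block
```

Real detectors combine many such signals (noise residuals, lighting direction, facial-geometry fits) and feed them into trained classifiers; this single statistic only illustrates the kind of per-region anomaly scoring an auditing pipeline would run at scale.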

A Legal Framework that Depends on Technical Evolution ⚖️

The delay to the AI Act, although it generates uncertainty, could be an opportunity to better align the legal framework with the pace of the technology. Forensic analysis tools must evolve alongside deepfake generators in a constant technical arms race. The future regulation must be flexible enough to incorporate new detection methods without hindering legitimate innovation in 3D graphics. The ban is a necessary step, but its real success will be measured by the technical capacity to sustain it.

How will the delay in the EU AI Act affect the ability of companies and authorities to audit and combat sexual deepfakes in the coming years?

(PS: Detecting deepfakes is like playing Where's Wally? but with suspicious pixels.)