Meta and ByteDance Agree to Let EU Assess Child Protection on Instagram and TikTok

Published on February 11, 2026 | Translated from Spanish
[Image: Instagram and TikTok logos over an EU-blue background, symbolizing regulatory oversight.]


The companies behind Instagram and TikTok have agreed to let the European Commission examine their systems for protecting young people. This voluntary move falls under the umbrella of the EU's Digital Services Act (DSA), which requires large platforms to assess and mitigate the risks their services may generate. 🛡️

The Scope of the Review and What's at Stake

Independently selected external auditors will analyze how the platforms design their features to safeguard adolescents' privacy and mental health and to prevent access to potentially harmful content. The findings will determine whether the Commission takes further enforcement action.

Possible Outcomes of the Audit:
  • If flaws are detected, the European Commission could open formal infringement proceedings.
  • Non-compliance could lead to heavy fines, which under the DSA can reach up to 6% of a company's global annual turnover.
  • The measure aims to force tangible changes in how social networks are designed and operated.

The platforms must demonstrate that their algorithms do not prioritize engagement over adolescents' well-being.

A Firmer European Regulatory Environment

This action reflects European regulators' increasingly strict stance toward tech giants, focused in particular on how recommendation algorithms and design decisions affect the youngest users. Meta and ByteDance accepted the review voluntarily, ahead of any possible official designation.

Context of the "Very Large" Designation:
  • Designation as a "very large online platform" (VLOP) under the DSA carries permanent, more rigorous compliance obligations.
  • This evaluation can be seen as a step prior to such a formal designation.
  • The ultimate goal is to create a safer digital environment for minors.

The Challenge for the Business Model

The fundamental challenge for Instagram and TikTok will be proving that their systems are not optimized to maximize screen time at the expense of mental health. That could mean rethinking core engagement features, such as "likes", if regulators determine they contribute to addictive use. The outcome of this evaluation could set a significant precedent for the entire industry. ⚖️