Half of Young People Turn to AI for Emotional Problems

Published on February 04, 2026 | Translated from Spanish
[Image: A young person sits before a laptop showing an AI chat interface; their expression is thoughtful, and the screen glows soft blue, evoking digital interaction and the search for advice.]

A recent analysis reveals that one in two teenagers in Italy has sought guidance from an artificial intelligence system for romantic dilemmas or psychological struggles. The pattern points to a shift in how younger generations seek support, prizing the speed and anonymity these platforms offer over a visit to a specialist. The finding has sparked an urgent conversation about both the usefulness and the limits of technology in matters of mental health. 🤖

AI Serves as a Confidential First Contact

Many teenagers report using these digital assistants as a starting point for expressing their emotions or trying to make sense of what is happening to them. The platform does not judge and can be reached at any hour, a factor they consider decisive. Experts warn, however, that AI lacks genuine empathy and may offer generic or, in some cases, incorrect recommendations. Its role should be limited to guidance, never a replacement for a formal therapeutic process.

Advantages Perceived by Users:
  • Constant availability, without schedules or waiting lists.
  • Prejudice-free environment where the user feels safe to talk.
  • Immediate response that calms initial anxiety.

An algorithm cannot diagnose or grasp the depth of an emotional conflict.

The Dangers of Relying Solely on Automated Responses

Relying exclusively on machine-generated answers to complex human problems carries several dangers. An automated system cannot gauge the true severity of a person's emotional state or the intricacies of a personal situation. Public health agencies stress the urgency of educating the public about these tools' limitations: their value lies in steering people toward a qualified professional, not in creating the false impression that the problem is already solved. ⚠️

Risks Identified by Experts:
  • Possibility of receiving generic advice that does not fit the specific situation.
  • Lack of follow-up and professional evaluation of progress or worsening.
  • Creation of dependency on a tool that does not establish a real human connection.

A New Landscape for Emotional Support

It seems the go-to source of advice on matters of the heart is no longer a close friend but a cloud server, one that, between conversations, also refines its algorithms for other ends, such as recommending products. The phenomenon underscores the need for an ethical and educational framework that teaches people to use technology as a complement to, never a substitute for, specialized interpersonal support. The challenge lies in integrating these tools without losing sight of the human connection required to heal the mind.