Can a Machine Heal a Heart? Why New Age Youth Are Turning to AI for Solace

Samah Qundeel, TwoCircles.net

New Delhi: We once believed that technology would bring people closer. But somewhere along the way, it did the opposite. Today, it is not rare to see someone glued to a screen, preferring virtual interaction over real conversation. The very tools designed to connect us have, in many ways, deepened the loneliness around us, and that loneliness shows no sign of easing.



In this growing silence, something unheard of is happening. More and more people are turning to AI chatbots for emotional comfort, seeking warmth in what many describe as their “nonjudgmental and always available” nature.

This shift is especially visible among young adults, many of whom hesitate to seek professional help for mental health. Some avoid it due to stigma, some do not have access to resources and others simply do not know where to begin. For many, it is easier to open a laptop and start typing into a chatbot than to walk into a therapist’s office.

Apps like OpenAI’s ChatGPT, Google’s Gemini and Apple’s Siri are gradually becoming silent companions. They are not therapy bots by design, but those who find solace in them say they offer a sense of relief to people who feel disconnected or unheard. They are distinct from AI tools built specifically for mental health support, which often operate under separate guidelines.

A 2023 study available through the NIH National Library of Medicine stated, “Chatbots have great potential to offer social and psychological support in situations where real-world human interaction is not preferred or possible to achieve. However, there are several restrictions and limitations that these chatbots must establish according to the level of service they offer”.

The study also warned about the flip side, cautioning that depending too heavily on chatbots can lead to increased isolation and a lack of proper human help during moments of crisis.

When AI Becomes a Listener

For 25-year-old Vaishali Bhat, a digital content creator, ChatGPT has become a tool for self-reflection. “I rely on chatbots such as ChatGPT due to the lack of emotional support in my life. Sometimes, I just prompt mentioning my situation and seek assurance and a kind of validation about my reaction,” she said.

Since moving to a new city for work in 2024, she often finds herself feeling cut off. She admits, though, that the chatbot’s responses do not necessarily make her feel better emotionally. It offers insights, but not healing.

For others like 27-year-old veterinarian Dr. Tabinda Wani, chatbots are a way to avoid judgment. “I often share my feelings with the chatbot and it gives me some tips on managing my emotions. That helps me feel a bit lighter and better,” she said.

Then there is 22-year-old BA student Summaya Arshad, who turns to Gemini during moments when she feels overwhelmed. “I do not want to burden anyone with my thoughts, but I still need to let them out,” she said.

For her, the value lies not in the solution, but in the space to breathe. “Sometimes, I just ask it to tell me something positive because I need to hear it, even if it is coming from a machine. It gives the comfort of being heard,” she explained.

Their stories reflect the findings of a 2024 study published in npj Mental Health Research. Many users described AI conversations as an “emotional sanctuary”, a space free of judgment, where one could speak without fear. But the study also pointed out that automated safety protocols sometimes interrupted these sensitive conversations, making users feel even more vulnerable when they were shut down abruptly.

Another study, published in PLOS Mental Health, compared ChatGPT’s responses to those of professional therapists in similar scenarios. The AI lagged behind human therapists by just 5%. In fact, ChatGPT’s responses were more often rated as empathetic, culturally sensitive and emotionally connecting.

A More Sensitive Chatbot?

In February 2025, OpenAI updated ChatGPT’s guidelines to handle sensitive subjects, including self-harm and mental health crises, with greater care. The updates aimed to make the chatbot more empathetic and nuanced in such exchanges.

These changes have improved how the model responds in emotional situations, but there is still unease among mental health experts. The unpredictability of AI-generated responses continues to raise concerns.

MBBS student Rida Fatima, 24, shared her experience of using ChatGPT during an anxious spell. “I just wanted to talk to someone because I was feeling really anxious. My roommate talks to ChatGPT a lot, so I thought I would give it a try,” she said.

“The responses it gave me really helped. It calmed me down. I felt like I can talk to it whenever I feel lonely or anxious or depressed without feeling like a burden,” she said.

‘AI Can’t Replace Traditional Therapy’

Even as more people warm up to AI companions, mental health professionals are urging caution. Dr. James Collett, a psychologist at Melbourne-based RMIT University, told the Herald Sun that while AI can support reflection, it cannot replace the human connection that is essential to therapy. He stressed that AI lacks the ability to challenge people in ways that lead to meaningful growth.

The Wall Street Journal reported that mental health support through AI is growing and that new developments are trying to make chatbots more emotionally aware. Hybrid models, which combine AI tools with human therapists, are emerging. For example, some schools are using AI tools whose responses are reviewed by psychologists, ensuring that serious cases receive the human attention they require.

Experts strongly argue that AI can assist, but it cannot replace therapy. The human aspects of care, namely empathy, presence and emotional intuition, remain irreplaceable.

Syed Tahir Rufai, a counselling psychologist, believes people turn to chatbots because they are easy to access. “The taboo around traditional therapy, especially in South Asian countries, also makes them lean towards chatbots,” he said.

But he warns that AI does not have the full picture. “I do not think AI chatbots can be a replacement for traditional therapy or even for real human connection because it will always lack that human touch. AI cannot tailor therapy according to the human experience keeping body language, empathy and expressions in mind,” he said.

Even as experts raise red flags, the demand for AI emotional support is growing, especially for those who hesitate to step into a therapist’s office.

“Unlike humans, AI does not judge, does not get tired and does not abandon. It is not the same as a real person,” Summaya said.

That belief might bring comfort to some, but it also reveals something deeper. In a world where we seek comfort in machines, we risk forgetting how to find it in each other. These chatbots do not challenge our behaviour or push us to grow. They mirror our feelings back at us. They do not tell us when we are wrong. Over time, this echo chamber could become dangerous.

The world has already shrunk to the size of a screen. And if we do not look up soon, we might find ourselves hurtling toward a future where technology does not merely mediate human connection but replaces it. Faster than we can say dystopia.
