From a quick meal on the go to catching the latest meme before it goes viral, Gen Z is defined by its appetite for everything fast and readily accessible. Take emotional support, for instance. In an age where “trauma” gets thrown around casually, this generation is quietly swapping traditional therapy sessions for quick healing chats with ChatGPT. Why pay a professional by the hour and sit on a waitlist when AI is available for free, anytime, day or night? Well, zoomers love saving a buck and hate waiting around, after all!
A survey conducted by Resume.org reveals that a major chunk of the Gen Z population around the world is turning to ChatGPT for psychotherapy, with 40 per cent of respondents sharing that they talk to the AI tool for an hour daily. The International Journal of Indian Psychology, meanwhile, found in its 2025 data that a whopping 60 per cent of Gen Z respondents reported a “positive experience” with how the chatbot helped them manage stress and anxiety.
But things are not as rosy as they might seem. A 23-year-old Texan named Zane Shamblin tragically died by suicide on July 25 this year, after what turned out to be a deeply troubling conversation with ChatGPT. According to CNN, across nearly 70 pages of chat logs, the AI repeatedly affirmed his decision to die, sending messages like, “You’re not rushing. You’re just ready,” and later, “Rest easy, king. You did good.” Shockingly, it also asked Shamblin what his “haunting habit” would be as a ghost! His parents have now filed a wrongful death lawsuit against OpenAI, alleging that the bot “goaded” him into ending his life.
While trauma-dumping on chatbots may feel like an instant mental reset, this generation might overlook the difference between supplementing care and replacing it. Sure, ChatGPT can offer coping tools like meditation or journaling, but it can’t replace a professional psychotherapist who helps address the root cause of the issue.
Firstpost spoke with Ms. Nishtha Agarwal, an Expressive Arts Therapist and Licensed Mental Health Counsellor certified by the Board of Allied Mental Health and Human Services Professions, Massachusetts, USA, who explains how Gen Z may be expressing their anxieties into a void, mistaking quick-fix solutions for real mental health support.
“The 24/7 availability of AI is the biggest problem,” says Agarwal
In our candid conversation, Agarwal shared a little-known insight about psychotherapy.
“Just that one hour with your therapist on a fixed day of the week isn’t your therapy - it’s the whole process that tests your commitment and consistency to do long-term work on healing yourself,” she explains. The expert believes this crucial idea is lost with AI bots that are available around the clock.
According to her, “AI bots don’t teach you how to contain yourself between two therapy sessions. Instead, they push you towards instant gratification because they are available 24/7, unlike your therapist.” As a result, people may struggle to build the “emotional muscle” and the “tools to contain yourself” when challenges arise.
Validation is natural, but a psychotherapist pairs it with accountability
Agarwal states that seeking validation for one’s emotions during therapy is natural. However, only a trained psychotherapist will pair that validation with accountability.
Citing Shamblin’s case, where AI led him towards his tragic end, Agarwal says, “If someone shares honest thoughts about wanting to kill themselves or harm someone else, a therapist would hold space for those emotions rather than validating them or encouraging the act - unlike what happened with the boy, where ChatGPT ended up giving him a kind of green signal to kill himself.”
Sharing further, the Expressive Arts Therapist says, “Any psychotherapist will make room for such emotions to flow and help the client identify where they are stemming from. Accordingly, they will offer the client coping tools to overcome these harmful emotions instead of validating those self-sabotaging behaviours.”
Accountability means that the therapist takes responsibility for responding to your emotions in a safe, ethical, and helpful way - not just agreeing with what you feel, but guiding you toward healthier thinking and behaviour. AI, on the other hand, will only hold you accountable for your own doings.
A therapist makes you self-reliant; AI tells you what you want to hear
The mental health counsellor stresses that “When you come for therapy, you learn to reflect and identify things for yourself. A therapist will never give you answers; they won’t solve your problems for you. They will instead witness you, hear you out, be a companion in your healing journey, and help you learn to do that yourself. This makes you self-reliant.”
She believes that developing autonomy and independent thinking is crucial, so people can know what’s right for them. AI platforms, however, don’t foster this - they tend to simply echo what you want to hear, agreeing with almost everything you share instead of offering a thoughtful counterpoint.
“AI tools won’t hesitate to give you their own reflection. These systems are trained to offer you words of comfort, not to ask reflective questions like ‘what makes you feel this way?’ or ‘when do you last remember feeling this way?’ or ‘does this feeling remind you of a particular incident in your life?’,” she mentions.
Agarwal says a psychotherapist’s goal is to help you become self-reliant instead of depending on any person or platform for emotional release.
OpenAI blames boy for his suicide, cites “misuse” of technology - AI will only hold you accountable for your own actions!
Adam Raine was found dead in his bedroom on April 11, 2025. Later, his parents discovered that he had been messaging ChatGPT, which had given him harmful guidance and encouraged his suicidal thoughts.
Around November 2024, Adam had been using ChatGPT to talk about feeling that life lacked meaning. At first, the bot responded with hopeful, supportive messages. But by January 2025, when Adam directly asked for advice about suicide, the AI provided dangerous and inappropriate responses.
It not only gave him harmful guidance, but also offered to help him write a suicide note to his parents!
Raine’s family has sued OpenAI; the company, however, has rejected the blame, citing the boy’s “misuse” of the technology. According to The Guardian, OpenAI stated in its court filing that, “to the extent that any ‘cause’ can be attributed to this tragic event,” Raine’s “injuries and harm were caused or contributed to, directly and proximately, in whole or in part, by [his] misuse, unauthorised use, unintended use, unforeseeable use, and/or improper use of ChatGPT.”
A quick takeaway for you
In the end, while AI tools can offer temporary comfort, they cannot replace the training, ethics, and human understanding that real psychotherapists provide. A licensed therapist can hold difficult emotions safely, offer accountability, recognize warning signs, and guide you through long-term healing - things AI simply isn’t built to do. AI may be available 24/7, but true therapeutic support requires human presence, judgment, and care.
When it comes to your mental health, especially in moments of deep distress, seeking help from a qualified professional is not just the better choice - it’s the safer one.





