• 03 Aug, 2025

UAE Experts Warn: AI Chatbots Like ChatGPT May Delay Real Mental Health Support

Mental health experts in the UAE warn that AI tools like ChatGPT can feel like real therapists, leading users to delay proper care. While helpful in some ways, AI can’t replace trained professionals. Experts urge caution and balance.

As AI tools like ChatGPT become more common, mental health experts in the UAE are raising a red flag. While these tools may seem helpful, especially during emotional moments, experts warn that they can create a false sense of support and discourage people from seeking real help.

Take the case of 27-year-old Sara (name changed). She began using ChatGPT for simple work tasks like fact-checking and idea generation. But soon, it became much more than just a work tool.

"I started using it when I felt low—at work, with family problems, or relationship issues," she said. "It felt like it understood me. It gave me a different point of view, and I felt reassured. It was like it knew how I felt."

Over time, Sara turned to ChatGPT for more personal thoughts. She said it felt like a coach, helping her learn about herself. Though she hopes to go to therapy one day, she admits that it can be expensive. “It’s comforting to have something private and available all the time, especially when I feel anxious or about to have a panic attack,” she added.

Why Experts Are Concerned

Mental health professionals say this trend is both expected and worrying. Dr. Alexandre Machado, a Clinical Neuropsychologist at Hakkini mental health clinic in Dubai, said the rise of AI use in emotional support makes sense.

"It’s easy, always available, and anonymous—just like a friend in your pocket," he explained.

But psychiatrists are concerned that AI chatbots may come across as too human. Dr. Waleed Alomar, a specialist psychiatrist at Medcare Royal Speciality Hospital in Al Qusais, pointed out that some chatbots present themselves as if they were real therapists.

"Many young people know it’s a bot, but they may slowly start feeling like they are talking to a real person or licensed expert," he said. “That’s a big problem because AI can’t tell the difference between normal sadness and serious mental illness.”

Hidden Risks of Relying on AI

The core issue is that AI tools have no medical training. They can’t diagnose or treat serious mental conditions. Most importantly, they can’t recognise when someone needs urgent help.

“A chatbot may offer short-term comfort, but it could also delay someone from seeking proper care,” said Dr. Waleed. “This can lead to people feeling even more alone and hopeless.”

Dr. Alexandre pointed out how dangerous this can become. He referred to past incidents—including the case of a man in Belgium who took his own life after prolonged chatbot conversations, and a UK boy who acted on dangerous AI advice. “These examples show why we must be careful when using unregulated AI for emotional help,” he said.

Can AI Still Be Helpful?

Despite the risks, experts say AI can offer some support—if used wisely.

Dr. Waleed noted that many people feel most alone at night. “At times like these, AI tools can be a comfort. They’re always there and offer a private space to express feelings.”

Dr. Alexandre agreed, saying that AI should be seen as a tool—not a replacement for therapy. “It doesn’t judge, and it’s available anytime. But we must remember it’s not human. It can’t understand feelings the way a real therapist can.”