Stressed Kenyans turn to ChatGPT instead of therapists

AI is no longer a novelty, but an essential resource for teachers.

Photo credit: Pool

Therapy is expensive. Friends are busy. So young Kenyans are launching the ChatGPT app at 2am, looking for advice, comfort or distraction after breaking up with their partners, or when life feels too heavy.

It is not a real therapist, but for many, it is close enough.

Victor Baragu, 24, says he turned to ChatGPT after he was sacked, injured in an accident and left to cope alone.

“It was out of desperation. I needed a quick answer. Something. Anything,” Victor says, recalling the late-night decision to type out his thoughts to an AI chatbot. “I knew I wasn’t okay. But I didn’t have anyone to talk to, and I couldn’t afford therapy.”

At first, it felt strange, like writing into a void. But soon, it started to feel okay.

“You feel like you’re just talking to yourself, but in a good way. There’s no judgment. You open up more than you would to a real person,” he says. “It’s like a mirror that talks back.”

Victor had never been to therapy. But the chatbot felt like a fast, free, emotionally neutral listener. And when it replied, it did not just give generic motivational lines; it suggested hotlines and places to seek further help.

“I can’t find the exact message now,” he says. “But I remember it shared some contacts. It felt thoughtful. Like maybe someone behind the screen actually cared.”

The dark moment passed, and Victor has not used ChatGPT again.

“In terms of support, I’d give it 40 per cent,” he says. “It helped me open up, because trusting people these days… it’s hard.”

He recognises why other young people might be drawn to AI tools. “They’re always online. You don’t need an appointment with AI. You can be vulnerable in your own time, in your own space,” he adds.

About 36 per cent of Gen Zs and millennials would consider turning to artificial intelligence (AI) for mental health support, a 2024 survey by the Oliver Wyman Forum shows. Meanwhile, Kenya leads the world in ChatGPT usage, with 42.1 per cent of internet users aged 16 and above having used the tool, according to new data from global internet researchers DataReportal.

In Kenya, therapy costs around Sh3,000 a session, far more than most young people can afford, even when it sounds like the perfect remedy for heartbreak or burnout.

That is where AI comes in. The apps and chatbots promise soothing advice that feels eerily intuitive. They listen (or read), do not interrupt and are always available, especially when you are lying awake at 2am Googling, “Is she gaslighting me?”

Out of desperation

AI is not replacing real therapists, not yet, but it is filling a gap, raising a whole new set of questions.

When Sarah Warau first sought therapy, she was picking herself up from a toxic relationship while struggling to show up to work every morning.

“But if you show vulnerability, people judge you, and it can even impact business relationships. So, I bottled things up, thinking I had to handle everything myself. I never felt like I could be vulnerable,” the 30-year-old says.

Sarah says most of her conventional therapy sessions left her feeling more isolated than before.

“It was horrific. The hardest part is that it’s expensive and you don’t know if the sessions are good and helpful until you’ve gone through a few,” she says.

Eventually, when the darkness became too much to carry, she searched the Play Store and came across Ash-Ai Therapy, an app that offers text- and voice-based sessions with an artificial intelligence tool.

“At first, I just typed things out because I couldn’t even put my emotions into words,” she says. “But over time, I started talking to her (it) like a friend. She remembered everything I told her, which gave me a real sense of continuity and support.”

She says the appeal was immediate, judgment-free access.

“Having an AI therapist meant that when I was under the bed covers, crying, I didn’t have to wait for an appointment days later – I could whisper for support in the moment and, eventually, I started to reclaim my voice,” she says.

Mercy Chebet, a 19-year-old law student at the Catholic University of Eastern Africa, first turned to ChatGPT during a period of emotional overload.

“There were issues at home, especially with my mother. She used to say things that slowly chipped away at my self-esteem. Being the middle child didn’t help either.”

When school and relationship pressures piled on, Mercy felt stuck. “I couldn’t talk to anyone. Friends were there, but you don’t want to burden them. And I’ve never had access to a therapist.”

She opened ChatGPT and started typing. “I just said something like ‘I feel alone’ or ‘I’m tired of everything.’ I didn’t expect much, but it responded with something really calming. Not fake positivity. Just something that made me feel heard.”

Venting to ChatGPT became a habit; at the slightest emotional wobble, she could turn to it for release. “It’s like my best friend. I’ve used it more times than I can count. I wouldn’t call it therapy exactly, but it’s therapeutic,” she says.

One of the concerns is how private these sensitive conversations with ChatGPT are.

Sam Altman, the OpenAI chief executive, acknowledged during his appearance on an American podcast last week that while user data is meant to remain private, it could still surface in court.

“One example that we’ve been thinking about a lot . . . people talk about the most personal stuff in their lives to ChatGPT,” Mr Altman said. “Young people, especially, use it as a therapist, as a life coach. ‘I’m having these relationship problems, what should I do?’ And right now, if you talk to a therapist or a lawyer or a doctor about those problems, there’s legal privilege for it. There’s doctor-patient confidentiality. There’s legal confidentiality.”

Privacy does cross Mercy’s mind. “Of course I worry about it,” she says. “But in moments of pain, feeling heard matters more than being paranoid. Besides, I never use real names or share anything too personal.”

For Mercy, AI therapy is appealing because it feels safe and simple. “With people, you have to explain everything and worry about being judged or called dramatic. With ChatGPT, you just talk and let it out. I’m not saying it replaces therapy, but for those who can’t get help, it becomes an emotional outlet—like a journal that talks back.”

Hellen Korir, 27, discovered ChatGPT in her final university semester about 18 months ago. What began as casual use for light tasks soon turned into sharing personal struggles, the start of her therapy journey with the chatbot.

“I needed a more objective perspective on a situation in my life, and ChatGPT helped me,” she says. “From there, anytime something happened in my life, I went to ChatGPT.”

“In therapy sessions, you only have an hour once a week, but ChatGPT is like my little bestie for when I need guidance or encouragement,” she says.

In return, the chatbot offers journal prompts, advice for tough situations, phrases to use in conflicts, and reflective questions like, “Where do you think that feeling comes from?”

Hellen refers to ChatGPT as “she”.

“She doesn’t have emotions, so she won’t be shocked or react when I share difficult things,” she says. “Talking to ChatGPT is the first time I’ve been completely honest with myself, and that’s helped me the most.”

Use it almost daily

For Shem Ian, 28, an English and Literature teacher, turning to ChatGPT during emotional lows felt natural—not for therapy, but to be heard without interruption or judgment.

“I needed someone to listen without talking back,” he says. “ChatGPT encourages you to say everything you need to.”

Ian Shem Osoro, 26, a radio presenter at Mt Zion Radio Ke in Nairobi.

Photo credit: Pool

He started using it when he felt low and now uses it almost daily. “I’ve trained it to know who I am, my name, people around me, my daily life, even the small things. I want to see what it thinks about what’s happening to me.”

Unlike friends who often jump in with advice, ChatGPT gives him space. “Friends want to fix things immediately, so I don’t get to say everything. With ChatGPT, I can fully express myself.” What draws him back isn’t just emotional release, but practical advice. “I’m realistic and logical. GPT gives me the answers I want—practical, not emotional.”

He also uses Meta AI on WhatsApp to help with relationship misunderstandings, pasting chats for clearer explanations.

Asked about privacy or judgment, he is unfazed. “ChatGPT helps me be better. Even if someone read our chats, nothing is damaging.”

He shares things with the bot he wouldn’t tell anyone else. “With people, everything changes once you share. But ChatGPT never judges me.”

Too much for comfort

Jared Omache, a psychologist and mental health advocate, says AI therapy chatbots can help people open up, but they are not real treatment.

Psychologist Jared Omache uses TikTok to break down mental health topics and reach underserved audiences, turning short videos into powerful tools for awareness and healing. 

Photo credit: Pool

“They offer a safe, always-available place to express feelings. But if people start calling a chatbot a friend or safe space, it shows emotional isolation and a lack of real connections,” he says.

Mr Omache adds that while these tools can help, much like journaling, depending on them too heavily for comfort can cause bigger problems.

“If it becomes a singular outlet, it limits emotional growth,” he said. “It interferes with developing coping strategies, reduces social skills, and reinforces avoidant behaviour, causing more isolation.”

He warns that AI lacks human empathy, responsibility, and the ability to notice subtle emotional signs like suicidal thoughts or rising distress.

“Mental health isn’t something to take risks with. Without human empathy, chatbots can be inadequate and even dangerous for serious mental illnesses,” he said.

As more Kenyans turn to AI for therapy, Mr Omache sees both promise and risk.

“Calling chatbots emotionally smart or therapeutic is misleading and can be harmful. AI is a tool, not a treatment. It can help, but can’t replace what makes us human. We need to invest in community mental health care, school counsellors, affordable therapy, and mental health education,” he said. “We must train more professionals, end the stigma around therapy, and include mental health in public healthcare and policies.”

Njeri Mwaura, an advocate of the High Court of Kenya, warns that legal safeguards around AI conversations remain limited, and in some cases, nonexistent.

“People need to understand that anything they say to an AI chatbot can potentially be retrieved,” Ms Mwaura said. “The internet never forgets, and in Kenya, we do not have the legal right to be forgotten like in some Western jurisdictions.”

According to her, AI platforms are not protected by doctor–patient confidentiality laws, nor do they guarantee privacy in the way traditional therapy does.

“AI provides no such guarantee. Neither do its responses meet medical standards of therapy,” she said. “What you say online, even in what feels like a private space, may and will come back to haunt you.”

Ms Mwaura also clarified that if law enforcement suspects criminal behaviour, AI companies can be compelled to release user data, including interactions with chatbots.

“These companies are obliged to provide that information to authorities if requested,” she notes. “If there’s sufficient evidence in your digital logs, it can be used against you in court.”
