Are AI Therapy Chatbots Safe? What You Need to Know

20 Nov 2025 | AI therapy, Counselling, Online therapy

With long NHS waiting lists and rising demand for mental health support, more people in the UK are turning to AI therapy chatbots — mental health apps like Wysa or Woebot, and general-purpose AI like ChatGPT — for help, advice and emotional support.

A recent report in The Guardian highlighted a major concern among therapists: that some people may be “sliding into an abyss” by relying on AI instead of seeking help from a trained, qualified counsellor.

How common are AI mental health chatbots in the UK?

  • A poll by Mental Health UK found that around 37% of UK adults have used an AI chatbot for mental health support.
  • According to a survey from the Ada Lovelace Institute, about 7% of UK adults say they have used a mental health–specific chatbot.
  • Research by YouGov also shows that young adults (18–24) are among the most comfortable talking to chatbot‑based services; in some NHS services, AI tools like Limbic are being used to support triage and assessments.
  • Meanwhile, therapists from the BACP (British Association for Counselling & Psychotherapy) report concern: some practitioners have noticed children and younger clients turning to AI for mental health advice, and they highlight that AI lacks the training and safeguarding of human therapy.

Where AI therapy chatbots fall short

Despite their appeal, AI mental health chatbots come with serious limitations:

  • Inability to reliably detect crisis or risk — they may miss signs of self-harm or suicidal ideation.
  • No professional training or ethical oversight — unlike human counsellors, they don’t follow regulated guidelines.
  • Potential for inaccurate or harmful advice, because responses are based on patterns in data, not real clinical judgment.
  • No authentic therapeutic relationship — empathy and human connection can’t be genuinely replicated.
  • Not suitable for complex emotional issues, such as trauma, long-term mental health conditions or deep relational healing.

AI can supplement, but not replace, human counselling

AI tools can play a helpful supporting role. For example, they are useful for:

  • Journaling prompts or guided self-reflection
  • Mood tracking
  • Basic psychoeducation

But for safe, therapeutic support, a qualified human counsellor is irreplaceable.

If you’re thinking of seeking support

Talking with a qualified counsellor — like someone from our team at Affordable Counselling UK — provides trusted, confidential, and ethical care that goes beyond what AI can offer.

“AI tools may be tempting, but therapy with a real person offers a level of connection, accountability and safety you won’t find in a chatbot.”

At Affordable Counselling UK, our mission is to make professional counselling accessible to all — because everyone deserves human care when it matters most.

Explore more

Want to read more articles with practical tips and support to help you on your journey? Browse our other blog posts here.

Contact us

If you would like to take the next step towards your counselling journey, we’d be delighted to hear from you. Please reach out to us by email, our Contact form or by WhatsApp, text or call on 07597 302810.