
AIBU?

To think ChatGPT et al will put lots of counsellors and therapists out of business

294 replies

GPTtherapist · 09/07/2025 09:41

I live in significant long term trauma due to a primary cause and substantial sub-causes. This is due to a usual combination of some quite unusual factors and I find most people do not have the experience or knowledge to understand them. Over the years I have seen a number of different counsellors/ therapists and support workers, most of whom are pretty useless and some who have made things substantially worse. Some have been clearly judgemental which has been enormously painful.

Last night when I was close to breaking down altogether, I used Chat GPT and it was brilliant. It was able to expand on what I said in a way that mimicked deep understanding and compassion for what I am going through. I actually cried at being so 'heard' and understood. Having the words to express so clearly what I experience was, well I can't put in words how it felt after all these years. It was also able to pick out parts of what I said to reflect back positive, encouraging things about myself. It was able to offer some suggestions which were actually helpful and which I am going to try as a coping mechanism. Best of all for someone like me, with huge issues around shame, I could speak openly and honestly about how I felt without any fear or shame around what the therapist might think of me.

So despite Chat GPT not being a person, I found I was able to get the emotional benefits as if it were a person understanding me, without the disbenefits of it being a person I might feel ashamed to tell how I feel.

Also, unlike a human therapist, it remembers and is able to respond to everything you say.

It was hands down better than nearly all human therapists/ counsellors/ support workers I have seen.

And it was free.

I realise for those able to afford long-term intensely skilled therapy for complex issues, a skilled experienced therapist is far preferable.

But in my experience, and that of many others, most therapists are pretty poor and expensive.

So surely ChatGPT will become a first port of call for many with mental health issues, which will reduce the number of those who decide they need a human therapist?

OP posts:
ObliviousCoalmine · 09/07/2025 10:58

GPTtherapist · 09/07/2025 10:56

Oh I don't know. I think the unempathetic ones are the people on this thread telling those in mental health crisis that they are being selfish using a tool they have found helpful, and they should cease and prioritise the environment over themselves.

If you’re having a mental health crisis, you need to speak to a person, with the ability to assess and think critically. Not AI.

notahappycabbage · 09/07/2025 10:59

ObliviousCoalmine · 09/07/2025 10:57

Are these constant “isn’t ChatGPT excellent” threads some sort of weird advertising infiltration? Surely people can’t genuinely be this stupid to think that AI can offer anything but an echo chamber of emotionless, sycophantic responses?

There were several threads yesterday that were obviously AI written. And as soon as anyone questioned why, OP (AI) never returned.

Ezzee · 09/07/2025 11:00

As a therapist I do think it has its uses, however one of our jobs is to safeguard and try to keep people safe. This isn't happening, and for some users it is becoming an addiction.

SylvanianFamiliesBalcony · 09/07/2025 11:00

As a therapist, I'm not really concerned. I think it's a great tool for day to day emotional support, reflections, a space to vent, getting feedback. But it can never really replicate a human therapist as it's tailored/educated by the person using it, and isn't as good at challenging, because it's like a mirror.

I'm all for people using it but I do think users need to be cautious it isn't just telling them what it thinks they want to hear and reinforcing unhelpful ideas/beliefs. It can become an echo chamber.

notahappycabbage · 09/07/2025 11:00

ObliviousCoalmine · 09/07/2025 10:58

If you’re having a mental health crisis, you need to speak to a person, with the ability to assess and think critically. Not AI.

There is not always a person to talk to though. Or that you can afford to talk to. Or you’ll have to wait 6 months.

CircusofPuffins · 09/07/2025 11:02

GPTtherapist · 09/07/2025 10:56

Oh I don't know. I think the unempathetic ones are the people on this thread telling those in mental health crisis that they are being selfish using a tool they have found helpful, and they should cease and prioritise the environment over themselves.

Isn't this a bit like telling people struggling with drug addiction to carry on using the drug if it makes them feel better in the short term, even though it's harmful to their health, rather than getting help to get off that drug?

YouOKHun · 09/07/2025 11:02

Noodge · 09/07/2025 09:43

As a therapist, yes I think you're right.
To be honest, I am seeking a career change so it isn't something I am personally all that concerned about. But the world of therapy is corrupt; it reminds me of an MLM or a cult.

I think one of the problems is that the ‘World of Therapy’ contains some very suspect, poorly qualified therapists, some highly skilled, well trained, accredited, supervised and ethical therapists, and many in between. It’s not a protected title, and it is a minefield for people to find someone in the private sector who is safe. That’s before even getting as far as finding someone who is the right fit. Of course a highly qualified therapist isn’t necessarily a good therapist or the right therapist, but I personally wouldn’t want to place my mental health in the hands of someone who has done a weekend course or spent three hours on the internet and downloaded a certificate.

Your mention of MLMs @Noodge really highlights a problem: as MLMs fail, more and more diehard MLMers are becoming “business coaches”. As the pool of potential business or mindset coaching clients is small, many of these coaches have strayed into clinical areas, claiming to offer instant solutions for all sorts of MH problems. Trauma is a particular favourite area. Many of these coaches operate much as they did in their MLM, and their coaching becomes a recruitment device: they offer training in a newly created and bogus therapy, and their trainees go out and recruit vulnerable people to have the therapy, leave a testimonial that they are cured, then train in the therapy themselves and recruit in turn (and so on). There is a cult-like positivity and usually a main leader. It IS a cult, and it’s massively damaging to the people who put their trust in it. Such big promises are made that it’s not surprising people desperate for support fall for it. If AI can kill coaching pyramid schemes off I’ll be happy. If AI can offer sensible pointers towards safe and well qualified support, that is also a good thing.

@GPTtherapist I think you make some very good points. The sense of feeling heard and understood is so important, and many therapists don’t reflect on this enough, or the pressure of their working environment (in the NHS) means they are not able to provide the support they probably would like to. I still think the flesh and blood version can’t be put out to grass just yet, as there are many and various areas of a therapeutic relationship AI can’t fulfil, and many techniques that require observation and face to face input by a therapist, too numerous to mention. AI is going to change the landscape of therapy; I hope it’s in a way that benefits people who are struggling.

Hammy19 · 09/07/2025 11:06

I think it's good for an overview, maybe as a starting point but it is extremely limited.

It is not 'trained' in anything; it simply takes information from other sites that match the words you have searched for. That means it'll give you the basics on popular topics, but it can't work out what course of treatment actually matches you personally.

It could also go very badly wrong depending on what terms you use in asking it questions.

Optimustime · 09/07/2025 11:06

It's also worth thinking about what data you're plugging into it. I do not believe for a second that ticking the "don't use my data to train the model" box actually means they don't harvest your data. OpenAI have been found harvesting/scraping all kinds of things they shouldn't have been. They are not an ethical company.

bumblingbovine49 · 09/07/2025 11:10

Easipeelerie · 09/07/2025 09:50

If your primary need is to be understood, rather than to connect with another person, then it’s excellent. I’ve found it absolutely brilliant for helping me see things clearly.

But you are not being understood. You are having things reflected back to you, and that is helping you to understand yourself. That is definitely an aspect of therapy, and may be the one you value most at the moment. It is certainly important for many people who have not already made a great deal of progress with their issues.

Being understood and accepted by someone else is a different thing, and by default has to involve another person. An imperfect other person whom you may have to understand and accept back, if you want that relationship to go any further and to progress to real acceptance of your challenges and weaknesses, not just understanding of them.

notahappycabbage · 09/07/2025 11:11

Off topic but there are so many posts on here now written by chat gpt. It’s depressing. I could not be bothered to engage. Like this one today: www.mumsnet.com/talk/_chat/5370205-why-do-people-feel-the-need-to-correct-me-when-i-talk-about-earning-more-money?page=1

Talltreesbythelake · 09/07/2025 11:27

GPTtherapist · 09/07/2025 10:13

I think you have just proved how good AI already is at mimicking real human behaviour and interactions.

I am a human being, but AI is so good at mimicking humans you think I am AI.

So yeah, talking therapists need to be worried if AI in its infancy is so good that people can't tell the difference between AI and a human.

Not sure that you have proved anything? I don't know you as all I can see are words on a screen. Maybe you are real and will be eating a packet of Frazzles while we chat, maybe you are just a response to a prompt. I don't really care. I'm just killing time, here.

I am suspicious of the repeated threads, though. Very odd. If you are really feeling 'heard' by AI why come here and try to talk with humans? Isn't your AI enough?

LittlleMy · 09/07/2025 11:27

Optimustime · 09/07/2025 09:47

Chatgpt just agrees with you so of course you think it's great.

I think over time you'll prefer the human connection.

Not really. It does challenge you and offers alternative ways of interpreting something.

Deep down of course, human interaction is the best, but that’s the whole point: it’s not readily available. And the connection isn’t always of the best quality either. E.g. my last counsellor always seemed to be caught out when I finished talking, and I could hear a lot of background noise too. It actually made me feel worse, and as it was she didn’t even remember to book my last session (it was through work), so no, not good for me. ChatGPT, however, is definitely a more than adequate service for someone to at least ‘trial’ counselling without making themselves too vulnerable or paying an arm and a leg.

Thegreatescape12345 · 09/07/2025 11:34

I find chat GPT helpful for venting and organising my thoughts, and having validating and empathetic responses.
For advice, it will never replace a person who knows you, your situation, and other things at stake - the wider picture, and also being a human being and having responsibilities and others to think about. I think chat GPT can sometimes encourage selfishness to the point that it's not actually to the benefit of the person.
I use it for general venting but if I want proper advice I find talking to a friend is much more helpful and realistic.

Lioncub2020 · 09/07/2025 11:54

I don't think it will. It learns in the same way that Facebook learns what you like and presents it to you. It will only critically appraise if you ask it to; the framing of the question is essential, as is applying critical self-awareness yourself. If you teach it that greatness is defined by orangeness, it will tell you President Trump is the greatest president. It will then remember your definition, so when you ask which is the greatest fruit, it will instantly know it is an orange. AI memory is really powerful but also risks creating bias in the future.

skinnyoptionsonly · 09/07/2025 12:18

I’d strongly disagree. Maybe it’s possible for bog-standard CBT counselling, where they basically just listen and teach CBT skills, but not for anything therapeutically complex involving proper psychological intervention.
It’s a fucking ridiculous notion tbh!

GPTtherapist · 09/07/2025 12:22

Talltreesbythelake · 09/07/2025 11:27

Not sure that you have proved anything? I don't know you as all I can see are words on a screen. Maybe you are real and will be eating a packet of Frazzles while we chat, maybe you are just a response to a prompt. I don't really care. I'm just killing time, here.

I am suspicious of the repeated threads, though. Very odd. If you are really feeling 'heard' by AI why come here and try to talk with humans? Isn't your AI enough?

What a fantastically stupid response.

OP posts:
LemondrizzleShark · 09/07/2025 13:03

MageQueen · 09/07/2025 10:25

Of course it has limitations.

I honestly don't understand this mindset of "it has limitations so it's not a useful tool." My airfryer is amazing, but has limitations. Public transport is astonishingly important, but it has limitations. I could go on. This is normal.

Those “limitations” have encouraged children to kill themselves. I suspect your air fryer has not.

LemondrizzleShark · 09/07/2025 13:06

CircusofPuffins · 09/07/2025 11:02

Isn't this a bit like telling people struggling with drug addiction to carry on using the drug if it makes them feel better in the short term, even though it's harmful to their health, rather than getting help to get off that drug?

Edited

I’m sure there are bot therapists suggesting exactly this.

MageQueen · 09/07/2025 13:21

LemondrizzleShark · 09/07/2025 13:03

Those “limitations” have encouraged children to kill themselves. I suspect your air fryer has not.

One woman claims a chatbot encouraged her child to commit suicide. If true, obviously awful. But also not exactly an endemic situation? I'm far more worried about the truly awful things we see children and teenagers do to each other, often, yes, supported by technology.

WhatNoRaisins · 09/07/2025 13:43

I think it's awful that it's come to a point where more people are drawn to this AI therapy because they don't have real life people that they can turn to. It's not surprising though.

timestressed · 09/07/2025 13:47

GPTtherapist · 09/07/2025 10:31

Human therapists will also be mimicking empathy for clients they may not feel sympathetic towards. I am very aware with human therapists that I am paying them to display empathy and caring. They are not going to keep caring when the £60 stops going into their bank account, either from me or a charity/NHS if I get it for free at point of use.
I don't know what the human therapist really thinks of me. At least I know where I am with AI.

It would be worth your reading this article: https://www.independent.co.uk/tech/chatgpt-ai-therapy-chatbot-psychosis-mental-health-b2784454.html

You talking to AI is not psychotherapy. In therapy, it is the therapist who asks the questions; in your conversation with an AI bot, it is you who asks them. That is why it isn't the same process.

ChatGPT is pushing people towards mania, psychosis and death

Record numbers of people are turning to AI chatbots for therapy, reports Anthony Cuthbertson. But recent incidents have uncovered some deeply worrying blindspots of a technology out of control

timestressed · 09/07/2025 13:50

A short summary from this research paper
In the study, 75 mental health professionals and trainees completed a cross-sectional survey in which participants gauged two text-based CBT transcripts, one from AI and one from a human therapist, using the Cognitive Therapy Rating Scale. Participants provided qualitative feedback on the transcripts and evaluated each one using a standardized scale. Participants gauged the quality of elements of CBT such as agenda-setting (listing tasks to be completed in the therapy session and ensuring all agenda items are completed) and guided discovery (helping the patient assess data from their own life to learn about themself). The therapist and bot evaluated identical clinical scenarios to provide consistency.
Twenty-nine percent of the survey participants rated human therapists as highly effective, whereas less than 10% of participants gave the AI therapist the same rating. More than half (52%) of participants scored the human therapist’s agenda-setting skills highest, whereas 28% did the same for the AI therapist. One in four (24%) participants gave the human therapists a high score in guided discovery, but only 12% scored the AI therapist similarly on the same element.

www.psychiatry.org/news-room/news-releases/new-research-human-vs-chatgpt-therapists

AnotherGreyMorning · 09/07/2025 14:09

CircusofPuffins · 09/07/2025 09:45

No. Please don't delude yourself with this nonsense.

A human will always be much better at connecting and understanding another human than AI. And the vast majority of people will always value the human connection they get from speaking to a real therapist rather than a machine that doesn't actually care about your problems or how you feel.

There are bad therapists, of course. But like with any profession, the people who are best at what they do will always be able to do it better than AI - whether that's writing, making art, working as a therapist, etc

Edited

I think you're wrong.

You can programme AI to be all sorts of things.

BeamMeUpCountMeIn · 09/07/2025 14:17

No. When even Google AI can't give a correct answer, there's no hope for a counsellor chatbot.
