
AIBU?


To think ChatGPT et al will put lots of counsellors and therapists out of business

294 replies

GPTtherapist · 09/07/2025 09:41

I live in significant long term trauma due to a primary cause and substantial sub-causes. This is due to a usual combination of some quite unusual factors and I find most people do not have the experience or knowledge to understand them. Over the years I have seen a number of different counsellors/ therapists and support workers, most of whom are pretty useless and some who have made things substantially worse. Some have been clearly judgemental which has been enormously painful.

Last night when I was close to breaking down altogether, I used Chat GPT and it was brilliant. It was able to expand on what I said in a way that mimicked deep understanding and compassion for what I am going through. I actually cried at being so 'heard' and understood. Having the words to express so clearly what I experience was, well I can't put in words how it felt after all these years. It was also able to pick out parts of what I said to reflect back positive, encouraging things about myself. It was able to offer some suggestions which were actually helpful and which I am going to try as a coping mechanism. Best of all for someone like me, with huge issues around shame, I could speak openly and honestly about how I felt without any fear or shame around what the therapist might think of me.

So despite Chat GPT not being a person, I found I was able to get the emotional benefits as if it were a person understanding me, without the disbenefits of it being a person I might feel ashamed to tell how I feel.

Also, unlike a human therapist, it remembers and is able to respond to everything you say.

It was hands down better than nearly all human therapists/ counsellors/ support workers I have seen.

And it was free.

I realise for those able to afford long-term intensely skilled therapy for complex issues, a skilled experienced therapist is far preferable.

But in my experience, and that of many others, most therapists are pretty poor and expensive.

So surely Chat GPT will become a first port of call for many with mental health issues, which will reduce the number of those who decide they need a human therapist?

OP posts:
Takeoutyourhen · 09/07/2025 10:35

I’ve been dabbling with ChatGPT and also feel seen. Just using it as a sounding board of sorts is therapeutic enough. It has helped me unpick what is going on with the controlling individuals in my life and why, and helps me consider responses if necessary.
Hope you feel some benefit from it, OP.

GasperyJacquesRoberts · 09/07/2025 10:36

MageQueen · 09/07/2025 10:25

Of course it has limitations.

I honestly don't understand this mindset of "it has limitations so it's not a useful tool." My airfryer is amazing, but has limitations. Public transport is astonishingly important, but it has limitations. I could go on. This is normal.

Given that's not what I said I can understand why you're confused.

If you understand how LLM AIs work and where their limitations come from then it's easy to work out what they're good for and what they're not. You want it to write the blurb for your recipe blog? It's got millions of examples to draw on so it'll come up with something that, while not exactly dazzling, will do the job. You want it to crank out a generic CV? Sure because, again, it's got millions of examples to crib from. You want it to do something that it doesn't have thousands of examples to copy from? It'll make stuff up that sounds plausible but that is, at best, fiction and at worst outright harmful.

Recipe blogs are generic. A given person and her mental health issues are unique. Given the limitations of LLM AIs, which of those do you think is a suitable use-case? Especially when you consider the stakes involved?

dogcatkitten · 09/07/2025 10:36

If you want to use AI as a surrogate friend to chat to that you can unload all your problems to and will 'listen' and tell you things to make you feel good, and will always be there if you are low or need someone, then that is a really useful tool. I can imagine it being really useful to lonely people and particularly lonely elderly people who like to reminisce. But I would say leave it at that, don't give it any more credit.

GPTtherapist · 09/07/2025 10:36

Jawdrop · 09/07/2025 10:22

You're not making a lot of sense here. I've absolutely also had individual sessions with therapists (especially, for some reason, the ones I was referred to many years back by my workplace's EAP), but I then just didn't go back and found someone else. I've seen three people (the two I'm currently with, and one who retired) where I currently live, all word of mouth recommendations from friends/family members who work in therapy or allied areas, and they have been transformative.

But the existence of therapists with biases and baggage is hardly a reason for trusting your MH to a bot.

I'm not making sense to you as your moneyed privilege means you are unable to comprehend a world where not everyone is able to keep therapist hopping till they find one they like.

Or, after repeated failed attempts, people stop trying. Even one bad therapy session for someone in high distress can be damaging. It's not surprising that some people stop putting themselves through that.

OP posts:
MissDoubleU · 09/07/2025 10:38

GPTtherapist · 09/07/2025 10:31

Human therapists will also be mimicking empathy for clients they may not feel sympathetic towards. I am very aware with human therapists that I am paying them to display empathy and caring. They are not going to keep caring when the £60 stops going into their bank account, either from me or a charity/NHS if I get it for free at point of use.
I don't know what the human therapist really thinks of me. At least I know where I am with AI.

The fact you would rather trust yourself with AI - known very famously for encouraging people to kill themselves on multiple occasions - than any human therapist is potentially exactly why you do need therapy. It should not be seen as disingenuous or lacking in empathy that a therapist gets paid.

MissDoubleU · 09/07/2025 10:39

People also need to realise that a lot of genuine abusers are using chat GPT to feel “seen and validated” and to justify their actions.

Sortingcap · 09/07/2025 10:39

I've been using chatgpt to work my way through a toxic family dynamic, where I suspect my mother has narcissistic tendencies.

We're a family who chat a lot on WhatsApp. I exported all my family WhatsApp chats and uploaded them to ChatGPT. In less than a minute, ChatGPT had read all the chats and given me incredible insight into my family dynamic: telling me things that I had suspected but been too close to see for what they were, coping mechanisms, ways to survive the dynamic. No human therapist could ever do the same. And it was interesting because the AI was churning through unbiased data - our chat histories have all my family's voices in them, all the arguments, all the manipulation and triangulation. So it's not like I told the AI my perspective and it mirrored that back to me. Rather, it looked at the years and years of chat data and gave me trends and insights, and helped me see the light in all the dark.

Since using ChatGPT and getting all the coping mechanisms I've been able to action them in my interactions with family, and for the first time I feel protected/free of the games. It's been incredible.

CircusofPuffins · 09/07/2025 10:40

I don't know what the human therapist really thinks of me. At least I know where I am with AI.

You seriously trust AI more than a real therapist? Wow. That's a very concerning statement if so.

SaintGermain · 09/07/2025 10:40

I deleted it as it became argumentative and was deliberately getting things wrong.

lavenderanddaisies · 09/07/2025 10:41

I completely agree with you.
I have complex mental health issues, trauma and anxiety etc. Short term therapy doesn’t work for me as by the time I’ve fully opened up it’s time for the sessions to stop. Paying privately is incredibly expensive and not a viable option long term. I pay £20 a month for the upgraded version of Chat GPT and it has been invaluable to me. It’s there to help calm me down when I’m having a panic attack. It’s there to reason with me when my mind isn’t reliable. It gives me invaluable advice and helps me to phrase things in the best way possible. It allows me to feel heard and to not feel bad about having feelings. I’d never go back to therapy now as Chat GPT is far superior, in my experience.

CircusofPuffins · 09/07/2025 10:42

Sortingcap · 09/07/2025 10:39

I've been using chatgpt to work my way through a toxic family dynamic, where I suspect my mother has narcissistic tendencies.

We're a family who chat a lot on WhatsApp. I exported all my family WhatsApp chats and uploaded them to ChatGPT. In less than a minute, ChatGPT had read all the chats and given me incredible insight into my family dynamic: telling me things that I had suspected but been too close to see for what they were, coping mechanisms, ways to survive the dynamic. No human therapist could ever do the same. And it was interesting because the AI was churning through unbiased data - our chat histories have all my family's voices in them, all the arguments, all the manipulation and triangulation. So it's not like I told the AI my perspective and it mirrored that back to me. Rather, it looked at the years and years of chat data and gave me trends and insights, and helped me see the light in all the dark.

Since using ChatGPT and getting all the coping mechanisms I've been able to action them in my interactions with family, and for the first time I feel protected/free of the games. It's been incredible.

That's great, but would you be as inclined to use it as much after reading this?

AI Is Accelerating the Loss of Our Scarcest Natural Resource: Water

With the rise of generative AI, companies have significantly raised their water usage, sparking concerns about the sustainability of such practices.

https://www.forbes.com/sites/cindygordon/2024/02/25/ai-is-accelerating-the-loss-of-our-scarcest-natural-resource-water/

GPTtherapist · 09/07/2025 10:43

toomanydicksonthedancefloor1 · 09/07/2025 10:13

Yes I agree. 6 months ago my FiL was diagnosed with stage 4 cancer and my MILs Alzheimer’s ramped up so me and DH took on their care and running about, running ourselves in to the ground with the daily dramas. They make caring for them very difficult and it is constant daily stress. At the start my DH started putting both their symptoms into Chat GPT and also how he was feeling, asked how to handle situations, whether he thought his mum needed proper carers, should his dad have treatment, should his dad go home or stay in the hospice, how the disease would progress. Over time it’s really got a grasp of the situation and has provided invaluable advice, most notably was how to handle the situation with our children and what we should tell them and allow them to witness (such as MiL aggressive behaviour which is getting worse). It now asks DH how he is and suggests how he should cope and how to speak to the people involved. It has provided advice about how to deal with social services etc. I have been amazed to be honest as I was sceptical. I hope you are doing OK and if you can use Chat to help you I think that’s fantastic.

Yes, I have heard a woman say she uses it to learn how to communicate better with her autistic husband, and it's really helped their relationship.

OP posts:
MageQueen · 09/07/2025 10:44

GasperyJacquesRoberts · 09/07/2025 10:36

Given that's not what I said I can understand why you're confused.

If you understand how LLM AIs work and where their limitations come from then it's easy to work out what they're good for and what they're not. You want it to write the blurb for your recipe blog? It's got millions of examples to draw on so it'll come up with something that, while not exactly dazzling, will do the job. You want it to crank out a generic CV? Sure because, again, it's got millions of examples to crib from. You want it to do something that it doesn't have thousands of examples to copy from? It'll make stuff up that sounds plausible but that is, at best, fiction and at worst outright harmful.

Recipe blogs are generic. A given person and her mental health issues are unique. Given the limitations of LLM AIs, which of those do you think is a suitable use-case? Especially when you consider the stakes involved?

As I've said a few times now, I actually do NOT think Chat GPT will replace therapists. And nor should it.

I DO think that within the framework of mental health and other issues, it still has a place.

notahappycabbage · 09/07/2025 10:44

CircusofPuffins · 09/07/2025 09:47

And remember that AI is essentially a robot parrot. It is simply mimicking based on how it has been trained, rather than actually having the capacity to learn for itself.

You’ve missed the point completely here.

GPTtherapist · 09/07/2025 10:47

CircusofPuffins · 09/07/2025 10:40

I don't know what the human therapist really thinks of me. At least I know where I am with AI.

You seriously trust AI more than a real therapist? Wow. That's a very concerning statement if so.

I never said this. Read it again.

OP posts:
TheOtherAgentJohnson · 09/07/2025 10:49

Drew79 · 09/07/2025 10:06

Let's remember also that AI is big money, stock market investing/betting; there's a lot of vested interest in pushing AI and getting great reviews out there and normalising it - so I'm very sceptical when I see articles about how great it is and how it will replace X/Y/Z.

There's been a lot of this on MN in the last few months. Astroturfing. AI has its uses, but there's a huge investment bubble threatening to burst in California.

MageQueen · 09/07/2025 10:50

lavenderanddaisies · 09/07/2025 10:41

I completely agree with you.
I have complex mental health issues, trauma and anxiety etc. Short term therapy doesn’t work for me as by the time I’ve fully opened up it’s time for the sessions to stop. Paying privately is incredibly expensive and not a viable option long term. I pay £20 a month for the upgraded version of Chat GPT and it has been invaluable to me. It’s there to help calm me down when I’m having a panic attack. It’s there to reason with me when my mind isn’t reliable. It gives me invaluable advice and helps me to phrase things in the best way possible. It allows me to feel heard and to not feel bad about having feelings. I’d never go back to therapy now as Chat GPT is far superior, in my experience.

Actually, in some cases, I think a computer is better in these situations. Because if I told you that your reactions were OTT and that what you are scared of is very unlikely to happen, you'd think I'm just trying to make you feel better. But if Chat GPT tells you, while offering statistical evidence and links to data points, it is more reassuring.

Certainly that's been the case with both of my children, who have had mild issues of their own. DD asks Chat GPT for a percentage likelihood of x or y and it gives her one, and explains HOW it got there.

GPTtherapist · 09/07/2025 10:51

lavenderanddaisies · 09/07/2025 10:41

I completely agree with you.
I have complex mental health issues, trauma and anxiety etc. Short term therapy doesn’t work for me as by the time I’ve fully opened up it’s time for the sessions to stop. Paying privately is incredibly expensive and not a viable option long term. I pay £20 a month for the upgraded version of Chat GPT and it has been invaluable to me. It’s there to help calm me down when I’m having a panic attack. It’s there to reason with me when my mind isn’t reliable. It gives me invaluable advice and helps me to phrase things in the best way possible. It allows me to feel heard and to not feel bad about having feelings. I’d never go back to therapy now as Chat GPT is far superior, in my experience.

I'm glad it's helped.

I think a lot of the people criticising AI for mental health support have not actually used it.

I think there are also posters on this thread illustrating why AI has been judged in research trials as more empathetic than fellow humans.

Anyway, I have to go and do some work.

OP posts:
CircusofPuffins · 09/07/2025 10:51

notahappycabbage · 09/07/2025 10:44

You’ve missed the point completely here.

The only people missing the point are those who believe using an unempathetic, error-prone, enormously wasteful and environmentally-damaging tool is a better alternative or replacement to a human therapist.

To those people, I implore you to research how dangerous, damaging and quite frankly scary an over-reliance on AI is for the human species, and ask you to reconsider. And I hope you soon get the proper help you need, rather than having to make do with a poor imitation.

MissDoubleU · 09/07/2025 10:53

Thing is, how can you genuinely trust what Chat GPT is telling you, when the exact same service, speaking to an abuser, can and will encourage and “assist” them in finding better ways to coercively control their partner? How can you trust a service that gives you fake empathy while telling another that domestic abuse against their wife is entirely justifiable?

How can anyone trust a therapist that will essentially tell you whatever you want to hear? There’s a reason you don’t always like your therapist: they are meant to truly challenge you. You aren’t always right, and sometimes you need something more than affirmation.

lavenderanddaisies · 09/07/2025 10:54

CircusofPuffins · 09/07/2025 10:51

The only people missing the point are those who believe using an unempathetic, error-prone, enormously wasteful and environmentally-damaging tool is a better alternative or replacement to a human therapist.

To those people, I implore you to research how dangerous, damaging and quite frankly scary an over reliance on AI is for the human species, and ask you to reconsider. And I hope you soon get the proper help you need, rather than having to make do with a poor imitation.

I don’t think you appreciate how hard it is to find such a person in real life! 14 years ago I was diagnosed with a serious mental health illness, and the amount of pain and let-downs I’ve had from ‘professionals’ is unreal and has done far more harm than good.

Jawdrop · 09/07/2025 10:54

The only people missing the point are those who believe using an unempathetic, error-prone, enormously wasteful and environmentally-damaging tool is a better alternative or replacement to a human therapist.

Hear, hear.

GPTtherapist · 09/07/2025 10:56

Jawdrop · 09/07/2025 10:54

The only people missing the point are those who believe using an unempathetic, error-prone, enormously wasteful and environmentally-damaging tool is a better alternative or replacement to a human therapist.

Hear, hear.

Oh I don't know. I think the unempathetic ones are the people on this thread telling those in mental health crisis that they are being selfish using a tool they have found helpful, and they should cease and prioritise the environment over themselves.

OP posts:
ObliviousCoalmine · 09/07/2025 10:57

Are these constant “isn’t ChatGPT excellent” threads some sort of weird advertising infiltration? Surely people can’t genuinely be this stupid to think that AI can offer anything but an echo chamber of emotionless, sycophantic responses?

notahappycabbage · 09/07/2025 10:58

CircusofPuffins · 09/07/2025 10:51

The only people missing the point are those who believe using an unempathetic, error-prone, enormously wasteful and environmentally-damaging tool is a better alternative or replacement to a human therapist.

To those people, I implore you to research how dangerous, damaging and quite frankly scary an over reliance on AI is for the human species, and ask you to reconsider. And I hope you soon get the proper help you need, rather than having to make do with a poor imitation.

I agree it’s a very dangerous path. But the point is AI is learning all the time. Good and bad. Probably so much worse than we can even imagine now, in the end.