
AIBU?

Share your dilemmas and get honest opinions from other Mumsnetters.

To think ChatGPT et al will put lots of counsellors and therapists out of business

294 replies

GPTtherapist · 09/07/2025 09:41

I live in significant long term trauma due to a primary cause and substantial sub-causes. This is due to a usual combination of some quite unusual factors and I find most people do not have the experience or knowledge to understand them. Over the years I have seen a number of different counsellors/ therapists and support workers, most of whom are pretty useless and some who have made things substantially worse. Some have been clearly judgemental which has been enormously painful.

Last night when I was close to breaking down altogether, I used Chat GPT and it was brilliant. It was able to expand on what I said in a way that mimicked deep understanding and compassion for what I am going through. I actually cried at being so 'heard' and understood. Having the words to express so clearly what I experience was, well I can't put in words how it felt after all these years. It was also able to pick out parts of what I said to reflect back positive, encouraging things about myself. It was able to offer some suggestions which were actually helpful and which I am going to try as a coping mechanism. Best of all for someone like me, with huge issues around shame, I could speak openly and honestly about how I felt without any fear or shame around what the therapist might think of me.

So despite Chat GPT not being a person, I found I was able to get the emotional benefits as if it were a person understanding me, without the disbenefits of it being a person I might feel ashamed to tell how I feel.

Also, unlike a human therapist, it remembers and is able to respond to everything you say.

It was hands down better than nearly all human therapists/ counsellors/ support workers I have seen.

And it was free.

I realise for those able to afford long-term intensely skilled therapy for complex issues, a skilled experienced therapist is far preferable.

But in my experience, and that of many others, most therapists are pretty poor and expensive.

So surely Chat GPT will become a first port of call for many with mental health issues, which will reduce the number of those who decide they need a human therapist?

OP posts:
Noodge · 09/07/2025 10:02

iloveeverykindofcat · 09/07/2025 09:56

I was just talking about this on another thread. I'm a sociologist and I'm really concerned with its potential impact on the way humans relate to each other. People think it listens, but it doesn't listen. It affirms what you say and confirms your biases and perceptions. If that's what people want from it... I don't know. Maybe that's helpful. I don't want to criticise anyone who thinks it's supporting them, particularly if they don't have other options. But it's definitely not listening or therapy.

Edited

My two degrees are in sociology and I would love to get back into working as one given all of my above post. But I can't find a way in after so much time out Sad

CircusofPuffins · 09/07/2025 10:02

MageQueen · 09/07/2025 09:56

You're kind of missing the point about AI completely. It learns ALL the time. That's the point. Yes, it's not genuinely empathetic etc, but its continuous learning means that its outputs, comments, insights etc are better all the time.

But why would you want to talk to something that has no empathy? And something that is not medically recognised as proper treatment for people who are genuinely suffering with their mental health, and which is probably doing more harm than good in the long run?

LovingLimePeer · 09/07/2025 10:03

I'm so glad that it worked for you when you felt in crisis, but no, I think there will still be a need for therapists.

A lot of the value of therapy and counselling is in the human element, in the judicious use of silence, in the spaces where people can deeply reflect on their experiences and thoughts. If I was in need of help (maybe excluding an acute crisis), I would choose a therapist for the reasons above. I find chatGPT responses sometimes formulaic and inhuman (personal view). ChatGPT won't pick up on body language/visual cues. AI may have a place for low level support e.g. online CBT but I can't see it being so widely used for more intensive treatments.

stayathomer · 09/07/2025 10:04

ChatGPT and technology can put pretty much everyone out of business if people decide to do a DIY job. The thing is, I think there's a huge difference between a face-to-face interaction, where you read people's reactions and feel their empathy, and something like ChatGPT. Also, the physical act of leaving the house and sitting down with a person, as opposed to further isolation by just typing, is huge.

Drew79 · 09/07/2025 10:06

Let's remember also that AI is big money (stock market investing/betting); there's a lot of vested interest in pushing AI, getting great reviews out there and normalising it, so I'm very sceptical when I see articles about how great it is and how it will replace X/Y/Z.

MissDoubleU · 09/07/2025 10:06

Many people using chat gpt for therapy are being pushed further into psychosis and mania. I’ve seen people falling deep into spiritual delusions.

It’s a dangerous trend. I don’t agree with AI use for this sort of thing or in fact most things.

LemondrizzleShark · 09/07/2025 10:07

GPTtherapist · 09/07/2025 09:55

But what it is drawing on is the vast knowledge of therapeutic approaches. It actually used the same analogy one of the good therapists that I had used. She was trained in that too, just like Chat GPT.

And of course a good relationship is key to successful therapy. But Chat GPT was really, really good at mimicking that. It's much better at mimicking it than nearly all human therapists (IME) are able to deliver in real life. I realise a good therapist will challenge clients too (though maybe Chat GPT can do that too if you ask, I don't know), but you have to have established a strong relationship first to challenge a client successfully, and too many therapists can't do that.

Also, if you do develop a strong relationship but cannot afford long term therapy, or rely, like me, on short term free therapy from charities etc, the pain when that relationship prematurely ends can be worse than having no therapy at all. But Chat GPT is free.

If your therapist ISN’T challenging you/helping you to reframe things, you are essentially paying £200 per hour for a cup of tea and pat on the head.

If that is all you want, fine, use a chatbot. But that is not therapy.

MageQueen · 09/07/2025 10:09

CircusofPuffins · 09/07/2025 10:02

But why would you want to talk to something that has no empathy? And something that is not medically recognised as proper treatment for people who are genuinely suffering with their mental health, and which is probably doing more harm than good in the long run?

Like all tools, and technology (including AI) is a tool, it's about the way it's used.
For myself, if I wanted proper therapy, I would 100% go to an actual therapist. I was the person upthread saying that actually, I do NOT think that AI will make therapists go out of business.

However, I think that this can be a very useful tool, used correctly and it's particularly useful for short term, minor things. I do concede however that there are many people who don't understand how to use tools like this, and that is more of a concern.

I have found it very helpful at times. And while I appreciate the "empathy" is not real, nonetheless the experience of it for me IS real. It has helped me to articulate and understand some of DS' sensory processing issues, for example. A friend uses it for work coaching and support. DS, who was an overweight child and still remembers that even as he's a tall, slim, sporty teen, told me that he uploaded a photo the other day because he didn't believe me that he was fit and healthy and in great shape.

GasperyJacquesRoberts · 09/07/2025 10:09

MageQueen · 09/07/2025 09:56

You're kind of missing the point about AI completely. It learns ALL the time. That's the point. Yes, it's not genuinely empathetic etc, but its continuous learning means that its outputs, comments, insights etc are better all the time.

How are you measuring "better"? Where's the feedback loop so that the AI can discern which of its outputs are good, and which ones are bad?

Talltreesbythelake · 09/07/2025 10:10

How can we even know that the OP is a human being who has really done this? There has been a similar thread for the past few weeks, each going over the same ground. I am suspicious that this is a marketing tool, not a real attempt at conversation.

GPTtherapist · 09/07/2025 10:10

Jawdrop · 09/07/2025 10:00

I would suggest that you have had bad experiences of therapy and don't know much about it, which are almost certainly related.

I have had bad experiences of therapy, as has almost everyone I know who has had therapy. It's really common.

People say AI is not empathetic. Neither are all therapists. They come with their own baggage and biases and ignorance of the issues clients have, whilst not realising they are ignorant.

PP is right. AI is learning all the time. It was pretty good when I used it and it is in its infancy. It will get way better. I heard an expert in child development and learning speak and she is being employed by AI developers to teach them how children learn and how they learn experientially. AI is going to get better at learning.

I realise the limitations. I realise current AI is a consumer product, often feeding back what it thinks its consumer wants to hear so they keep consuming. But when there are a lot of crap, expensive therapists, and it's really good both at making people feel understood and offering helpful advice, it's going to be a threat to the counselling industry.

OP posts:
GPTtherapist · 09/07/2025 10:11

GasperyJacquesRoberts · 09/07/2025 10:09

How are you measuring "better"? Where's the feedback loop so that the AI can discern which of its outputs are good, and which ones are bad?

The client is the feedback loop. Same as it would be for a human therapist.

OP posts:
mediummumma · 09/07/2025 10:12

I’m sorry that your previous therapeutic experiences didn’t provide you with what you need. Unfortunately human connection is complex; it takes time to build trust and each therapist is as unique and individual as each client so it can be challenging to find the right ‘fit’.

I think AI can be useful in lots of ways but it cannot ever be empathetic or accepting. It does not emote and it is not ethical. It draws on information, both correct and incorrect, and will misrepresent that information in order to please you. It’s an artificial connection that has a place in our lives, but it will not replace real acceptance, care and love offered by good therapists.

SummerShimmer · 09/07/2025 10:12

It doesn’t just reflect back what you said, it offers insights. I’ve been using it to ‘talk’ about my recent cancer diagnosis and it’s been bloody brilliant. It’s made me feel much better about my way of coping and suggests things to try etc.

A previous poster said that it was just an algorithm - please look up the training for PWP in Talking Therapies services…

MageQueen · 09/07/2025 10:12

GasperyJacquesRoberts · 09/07/2025 10:09

How are you measuring "better"? Where's the feedback loop so that the AI can discern which of its outputs are good, and which ones are bad?

It gets feedback all the time from millions of users, and within your own chats with it, from you. That's the point.

I do get very frustrated with the broad anti-AI rhetoric. What is far more important is for people to learn how to use it, and what its limitations are. Just like when the internet first came along, so many people thought it was all rubbish and would rot our brains etc, and we all had to learn how to use it. Of course, lots of people still haven't figured that out, so I guess the chances are that AI will be the same.

But it is a powerful tool, improving constantly, and becoming more embedded constantly so we're all going to have to learn to adapt.

toomanydicksonthedancefloor1 · 09/07/2025 10:13

Yes, I agree. Six months ago my FIL was diagnosed with stage 4 cancer and my MIL's Alzheimer's ramped up, so me and DH took on their care and running about, running ourselves into the ground with the daily dramas. They make caring for them very difficult and it is constant daily stress.

At the start my DH started putting both their symptoms into Chat GPT, and also how he was feeling, asked how to handle situations, whether he thought his mum needed proper carers, should his dad have treatment, should his dad go home or stay in the hospice, how the disease would progress. Over time it's really got a grasp of the situation and has provided invaluable advice, most notably how to handle the situation with our children and what we should tell them and allow them to witness (such as MIL's aggressive behaviour, which is getting worse). It now asks DH how he is and suggests how he should cope and how to speak to the people involved. It has provided advice about how to deal with social services etc.

I have been amazed, to be honest, as I was sceptical. I hope you are doing OK, and if you can use Chat to help you I think that's fantastic.

LemondrizzleShark · 09/07/2025 10:13

GPTtherapist · 09/07/2025 10:11

The client is the feedback loop. Same as it would be for a human therapist.

Which is how you end up with a pro-ana chatbot on the National Eating Disorders website.

Mentally ill people are not always the best judges of whether a chatbot’s advice is good or not.

GPTtherapist · 09/07/2025 10:13

Talltreesbythelake · 09/07/2025 10:10

How can we even know that the OP is a human being who has really done this? There has been a similar thread for the past few weeks, each going over the same ground. I am suspicious that this is a marketing tool not a real attempt at conversion.

I think you have just proved how good AI already is at mimicking real human behaviour and interactions.

I am a human being, but AI is so good at mimicking humans you think I am AI.

So yeah, talking therapists need to be worried if AI in its infancy is so good that people can't tell the difference between AI and a human.

OP posts:
StMarie4me · 09/07/2025 10:14

Even AI experts don’t trust AI. Why would you trust fragile mental health/ emotions to it?

Cadenza12 · 09/07/2025 10:15

I've used it too and it was amazing. It didn't just parrot back which is what you would have expected. I think that until you've tried it you really don't get just what it's capable of. It's scary too.

GPTtherapist · 09/07/2025 10:15

LemondrizzleShark · 09/07/2025 10:13

Which is how you end up with a pro-ana chatbot on the National Eating Disorders website.

Mentally ill people are not always the best judges of whether a chatbot’s advice is good or not.

That's a current flaw in a product in early development, yes.

But that can be designed out over time, so the AI combines professional knowledge and research with client feedback to give a more appropriate service.

OP posts:
99bottlesofkombucha · 09/07/2025 10:15

LemondrizzleShark · 09/07/2025 09:51

I have also heard stories of ChatGPT reflecting back people’s feeling of worthlessness/suicidality. Cheerfully agreeing that their families would be better off without them etc.

It is just a computer, it is good at making people feel listened to because it reflects back what is being said, but it doesn’t actually have any insights or objectivity.

This; and also ChatGPT supporting people to feel that someone has screwed them over and really should be punished, or that that woman really did lead him on and deserves it. There will be an AI ‘therapy’ driven murder/harm as well as AI ‘therapy’ driven suicide. That’s what the affirming model does.

Ribecx · 09/07/2025 10:15

But do you really want big tech to have a written record of your deepest, innermost thoughts?

chachahide · 09/07/2025 10:16

I think a lot of people who go to therapy have attachment issues, meaning they were not securely attached to a caregiver in their childhood.

The point (as I understand it), is to have that attachment grow, to a human person, throughout the therapeutic process.

Chat gpt can’t sit with me and validate, authentically, and can’t build that human attachment and connection with me. It’s also about being accepted by another person, despite your faults. Which may not be happening elsewhere in your life.

I think it’s great if you can’t afford therapy and indeed for very specific things, maybe CBT? Which has a specific process. But for the therapy I get it just wouldn’t work.

GasperyJacquesRoberts · 09/07/2025 10:17

GPTtherapist · 09/07/2025 10:11

The client is the feedback loop. Same as it would be for a human therapist.

Really? So ChatGPT can tell when you're taking longer to answer because you're struggling? It can hear you start to choke up while talking about particularly difficult topics? It notices when you change the subject to avoid talking about something you don't want to talk about? It can spot when you start missing appointments and then call you to check up to see how you're doing?

Wow. AI must be more advanced than I thought.
