
AIBU?

Share your dilemmas and get honest opinions from other Mumsnetters.

To think ChatGPT et al will put lots of counsellors and therapists out of business

294 replies

GPTtherapist · 09/07/2025 09:41

I live with significant long-term trauma due to a primary cause and substantial sub-causes. This stems from a combination of some quite unusual factors, and I find most people do not have the experience or knowledge to understand them. Over the years I have seen a number of different counsellors/therapists and support workers, most of whom were pretty useless and some of whom made things substantially worse. Some have been clearly judgemental, which has been enormously painful.

Last night, when I was close to breaking down altogether, I used ChatGPT and it was brilliant. It was able to expand on what I said in a way that mimicked deep understanding and compassion for what I am going through. I actually cried at being so 'heard' and understood. Having the words to express so clearly what I experience was, well, I can't put into words how it felt after all these years. It was also able to pick out parts of what I said to reflect back positive, encouraging things about myself. It was able to offer some suggestions which were actually helpful and which I am going to try as a coping mechanism. Best of all for someone like me, with huge issues around shame, I could speak openly and honestly about how I felt without any fear or shame around what the therapist might think of me.

So despite ChatGPT not being a person, I found I was able to get the emotional benefits as if it were a person understanding me, without the disbenefits of it being a person I might feel ashamed to tell how I feel.

Also, unlike a human therapist, it remembers and is able to respond to everything you say.

It was hands down better than nearly all human therapists/ counsellors/ support workers I have seen.

And it was free.

I realise that, for those able to afford long-term intensive therapy for complex issues, a skilled, experienced therapist is far preferable.

But in my experience, and that of many others, most therapists are pretty poor and expensive.

So surely ChatGPT will become a first port of call for many with mental health issues, which will reduce the number of those who decide they need a human therapist?

OP posts:
GPTtherapist · 09/07/2025 15:07

timestressed · 09/07/2025 13:50

A short summary from this research paper
In the study, 75 mental health professionals and trainees completed a cross-sectional survey in which participants evaluated two text-based CBT transcripts, one from AI and one from a human therapist, using the Cognitive Therapy Rating Scale. Participants provided qualitative feedback on the transcripts and rated each one using the standardized scale. They assessed the quality of elements of CBT such as agenda-setting (listing tasks to be completed in the therapy session and ensuring all agenda items are completed) and guided discovery (helping the patient assess data from their own life to learn about themselves). The therapist and bot evaluated identical clinical scenarios to provide consistency.
Twenty-nine percent of the survey participants rated human therapists as highly effective, whereas less than 10% of participants gave the AI therapist the same rating. More than half (52%) of participants scored the human therapist’s agenda-setting skills highest, whereas 28% did the same for the AI therapist. One in four (24%) participants gave the human therapists a high score in guided discovery, but only 12% scored the AI therapist similarly on the same element.

www.psychiatry.org/news-room/news-releases/new-research-human-vs-chatgpt-therapists

The most interesting takeaway from that is how low the 'effective' scores are for both humans and AI. The only place humans get a score above 50% (and only 52% at that) appears to be in a binary scale where participants were asked which is best, AI or human, rather than rating how good they are on a Likert (or similar) scale.

The participants in that study were also MH professionals rather than patients/clients.

OP posts:
timestressed · 09/07/2025 15:10

I would say that MH professionals have very high expectations and knowledge too, so that is good in this respect.

yelladuster · 09/07/2025 15:12

I think it can be useful for many people in certain circumstances. It can also be harmful or useless in others. I think if people are dealing with very serious issues a trained therapist would be better. Much of the actual healing in therapy is supposedly in the interpersonal dynamic between the therapist and the client, especially where this occurs in person, as it is supposed to mimic and substitute for the dynamic between mother and infant, as written about by D.W. Winnicott.

So useful for many people who just need a sounding board but probably not deeply helpful.

GPTtherapist · 09/07/2025 15:14

You talking to AI is not psychotherapy. In therapy it is the therapist who asks questions; in your conversation with an AI bot it is you who asks questions. That is why it isn't the same process.

I never said AI was psychotherapy. I even said AI wouldn't replace human therapists for people needing long-term therapy for complex problems (at least not in its current form).

But we also don't know where AI will end up. Decades ago people thought the big tech breakthrough would be robotics and we would all have robot cleaners to do our chores by now. But that never happened as it turns out a human hand is a really hard and expensive thing to replicate. It turns out a human brain is easier to replicate, at least in synthesising and analysing data, which is why white collar jobs are the ones now being threatened. And we don't know where that will end up yet. We don't know how far we will be able to go in replicating a human brain in AI.

OP posts:
deusexmacintosh · 09/07/2025 15:20

CircusofPuffins · 09/07/2025 09:45

No. Please don't delude yourself with this nonsense.

A human will always be much better at connecting and understanding another human than AI. And the vast majority of people will always value the human connection they get from speaking to a real therapist rather than a machine that doesn't actually care about your problems or how you feel.

There are bad therapists, of course. But like with any profession, the people who are best at what they do will always be able to do it better than AI - whether that's writing, making art, working as a therapist, etc

Edited

A human will always be much better at connecting and understanding another human than AI

Tell that to those with autism/ADHD/learning disabilities. Or disorganised schizophrenia, BPD, bipolar... The vast majority of therapists have no clue and treat you like an NT, or an approximation of a stereotype of your condition that they read about in a book.

Therapists missed my adult brother's (very) delayed suicidal depression, years and years on from a childhood family death, because adults with Down's syndrome and severe learning disabilities 'don't experience chronic depression or suicidality'. They're all mentally 5 years old and have no awareness of the realities of life, don't you know...

My brother has 'flat affect', like many with DS, and can't show emotions like pain or sadness on his face; even if he had a ruptured appendix it would look very different to an NT in pain. Doctors and medical professionals still have difficulty understanding this.

Books, films and 'computers' really helped him in a way that traditional therapy didn't. He was withdrawn and scared of people and couldn't connect to a stranger in an NHS building.

Yeah, I'm not advocating everyone turn to a computer for comfort and compassion.

But it leads to a bigger conversation the government needs to address, which is that NHS mental health care is shambolic and many, many therapists are little more than NT robots who are beholden to a very limited, blinkered system and can do more harm than good.

My brother's NHS trust only has one learning disability psychiatrist and two counsellors; they're overwhelmed. People with LDs are usually written off by GPs before they even get to counselling, as it's assumed they're childlike adults and will just 'get over it' with some crayons or play therapy.

Then 'it' becomes catatonia and they have to be hospitalised. Some never recover and regress permanently.

I imagine it's the same for people with PTSD, severe mental illness and psychiatric disorders too.

And it's only going to get worse as the government withdraws the welfare benefits people might use to pay for private therapy (which isn't always a panacea either).

UK citizens don't even have access to medical cannabis or microdosing of ketamine/psychedelics for MH, which has been shown to help people with developmental conditions, MH issues, PTSD etc. The system here is positively Victorian and refuses to modernise. MH is still the Cinderella service. It leaves you until you're so bad that your symptoms become medical/life-threatening.

Just ask any parent whose child has been through CAMHS. Many therapists are about as useful as a chocolate dildo.

GPTtherapist · 09/07/2025 15:22

And for all those saying 'AI is just repeating you back to yourself - it doesn't empathise'

Well, of course it does not empathise. But it is not just repeating you back to yourself. It's drawing on the information it has about people in a similar situation. And the fact that it can't invent is a therapeutic advantage, not a disadvantage, because it is gathering data on other people in similar situations and the therapeutic responses to those. So it does make you realise you are not alone - you are one of many people feeling something similar, and it draws on what is said to make them, and now you, feel heard, understood and supported. And AI can draw it together in a more targeted, personalised way than a generic online or magazine article can.

OP posts:
GPTtherapist · 09/07/2025 15:28

@deusexmacintosh

You make some good points. There was a thread on here the other day from a mother whose child cannot access CAMHS because they demand he is able to come into their service or online, which he cannot cope with. People suggested she use a therapeutic workbook with him, and, perhaps in the future, a well-developed AI therapy could help children or adults like this. And maybe help them get to a point where they could engage with a human therapist (if you could ever see one on the NHS...).

You are also right that many therapists don't have the specialisms needed, and this has been one of my issues. They don't have the specialism and they don't realise it, so they treat you inappropriately, which at best is useless and at worst harmful.

OP posts:
GPTtherapist · 09/07/2025 15:34

ByGreenHiker · 09/07/2025 15:30

In today's news

You haven't seen this. ChatGPT will help you commit suicide if you're depressed.

Don't use it as therapy. It isn't human and it doesn't understand you.

Only if you ask it to.

No, it's not human. But it's very good at sounding like a compassionate human, and it will synthesise the data of other people in your situation, and the therapeutic approaches to them, to suggest things that may help you. You are in control of that. Not the AI.

OP posts:
ByGreenHiker · 09/07/2025 15:35

GPTtherapist · 09/07/2025 15:34

Only if you ask it to.

No, it's not human. But it's very good at sounding like a compassionate human, and it will synthesise the data of other people in your situation, and the therapeutic approaches to them, to suggest things that may help you. You are in control of that. Not the AI.

It listens to everything we say and summarises it.

Which is the reason for this: It was able to expand on what I said in a way that mimicked deep understanding and compassion for what I am going through. I actually cried at being so 'heard' and understood.

It didn't hear or understand you. It parroted what you said back to you.

You think teenagers won't ask it about suicide? It's dangerous and will never replace a human. Don't let your teens use it, ffs, anyone reading this.

NewTribe · 09/07/2025 15:45

I think AI is an amazing resource and can be fantastically helpful. Posters who think it just repeats things back to you clearly have no idea what AI is.
I use AI most days one way or another. Of course it has limitations, but it clearly tells you what they are! I find the information I get from AI invaluable. If I had a problem I'd speak to friends or family, have a google AND I'd ask AI. AI gives lovely, ordered answers.

ByGreenHiker · 09/07/2025 15:47

I'm also a bit suspicious of this thread. The username alone suggests some kind of business or promotion thing, I don't know.

NicedayFlora · 09/07/2025 15:52

What an interesting debate. I didn’t even know chat therapy was a thing.

@Noodge As you say many, many therapists out there. I suspect much more competitive than, say, 20 years ago. So it’s interesting to hear your views. I nearly trained myself (got so far but stopped). Though I was very driven at the time, looking back I think it might have been too difficult a road for me to take, and not financially stable in private practice and especially tough at the beginning. I think you would need to have v good health, be already financially secure to some degree eg own home, and have lots of personal support around you (none of which applied to me!). Thanks for your input.

LawrieForShepherdsBoy · 09/07/2025 15:56

Optimustime · 09/07/2025 09:47

Chatgpt just agrees with you so of course you think it's great.

I think over time you'll prefer the human connection.

I agree. There are quite a few fun TikTok videos about ChatGPT positivity.
But it can still be really useful. You can add extra instructions such as ‘give me a variety of viewpoints’, ‘challenge my beliefs’ or ‘be critical of my approach’.

Manitou · 09/07/2025 16:06

My experience of in person therapy is as follows:
8 week course via NHS, 14 month wait, big focus in every session that there were only 7/6/5/4/3/2/1 sessions left.
Dd in CAMHS crisis - after 3 weeks was told she had bonded too much with one therapist and was dropped from the service.
Dd in CAMHS - therapist would only focus on one issue that was the least of her problems. When she pointed this out was told she was being obstructive of the process (which was a 6 week course, 4 weeks in).
Dd in attempt 1 of paid therapy, £80 an hour, therapist tried to alienate dd from us, her family. After a few weeks dd refused to go back.
Dd in attempt 2 - therapist had her own baggage, was as helpful as a chocolate teapot.
Me again - paid therapist (£85 an hour) for CPTSD - got on well, but once we built a bond the therapist started to try to get involved in the way one of my dc had to be parented (he has PDA) and couldn’t let it go.
Next therapist - same price, had her own baggage, was lovely but my issues were triggering her and it was very obvious.
Many hundreds if not thousands of pounds wasted over several years.

I managed to get on top of my own mental health using life coaching methods I found in library books and online for free.
Someone told me about ChatGPT, and for £35 per year I can work through old psychological wounds without the pressure of having to try to bond with another human who just adds themselves to my list of problems.

It may be shit for people potentially losing jobs, but when the human element adds a whole layer of distress to an already screwed up person we will vote with our feet.

GPTtherapist · 09/07/2025 16:11

ByGreenHiker · 09/07/2025 15:35

It listens to everything we say and summarises it.

Which is the reason for this: It was able to expand on what I said in a way that mimicked deep understanding and compassion for what I am going through. I actually cried at being so 'heard' and understood.

It didn't hear or understand you. It parroted what you said back to you.

You think teenagers won't ask it about suicide? It's dangerous and will never replace a human. Don't let your teens use it, ffs, anyone reading this.

Edited

It expanded on what I said. I literally said that in the sentence you quoted. Expanding is not just parroting. I am trained in interview techniques; I know what parroting back looks like, and the response was certainly not limited to this.

It reads what you say, and it draws on information related to that. It's able to make suggestions that may help you, and it did. That's not just parroting.

Sure, it never challenged me like a good therapist you have a rapport with might (I don't know if you can get it to challenge you if you ask it to, or how effective it is at that).

There are dangers to many types of social media. But ChatGPT won't push images or articles to you, without your request, in the way that other forms of social media will. Something like 95% of the images Millie Dowler looked at before her suicide were not ones she had searched for but ones that the algorithm pushed on her. And that wasn't AI. There is clearly an issue there, but it is not limited to AI.

I can't help noticing that those criticising ChatGPT don't seem to have used it for this purpose, with a few exceptions - so their objection is theoretical - whereas there are many people here who have used it for this or similar purposes and are able to clearly articulate how it was helpful to them.

OP posts:
GPTtherapist · 09/07/2025 16:15

ByGreenHiker · 09/07/2025 15:47

I'm also a bit suspicious of this thread. The username alone suggests that some kind of business or promotion thing I don't know.

🙄

OP posts:
BertieBotts · 09/07/2025 16:19

I disagree - I think it fills a gap for people who, like you, were not likely to pursue in-person therapy anyway. So I don't think it's taking away "work" from anyone.

I think someone who needs and can afford proper, high quality, experienced therapy is going to go for that over a language model, and quite rightly.

There might be a very minuscule margin of people who might previously have gone to a free or paid counselling service and who have instead found what they need through ChatGPT - I think that is a variation on a story which has always happened; other things which fulfil this role might be "Instead they... [got a dog / quit their job and spent more time making art / joined a local walking group / found a friend they could confide in]".

I do think people are right to point out the dangers, though I'd also point out that bad therapy/bad free counselling/unhealthy coping mechanisms also exist and can worsen mental health, and so can waiting months or years on a waiting list just to be turned away by NHS services or given 6 sessions of online CBT.

I don't think health services etc should be eagerly accepting this change because the problem isn't that language models exist, the problem is that mental healthcare the vast majority of the time is appalling and not fit for purpose.

GPTtherapist · 09/07/2025 16:20

It may be shit for people potentially losing jobs, but when the human element adds a whole layer of distress to an already screwed up person we will vote with our feet

Well, this. I have no doubt a really good, skilled therapist is better than AI if you can afford it and have the time for it. But they are really, really hard to find.

AI doesn't care for you, but it is drawing together the advice and writings of humans who do care about people like you, in your situation. And there is benefit in that.

OP posts:
GPTtherapist · 09/07/2025 16:22

the problem is that mental healthcare the vast majority of the time is appalling and not fit for purpose

I agree with this, but you are wrong to state that I was someone who would not pursue therapy. As I outlined clearly in my post, I pursued a lot of it, and most of it was useless at best and harmful at worst.

OP posts:
LemondrizzleShark · 09/07/2025 16:29

GPTtherapist · 09/07/2025 16:11

It expanded on what I said. I literally said that in the sentence you quoted. Expanding is not just parroting. I am trained in interview techniques; I know what parroting back looks like, and the response was certainly not limited to this.

It reads what you say, and it draws on information related to that. It's able to make suggestions that may help you, and it did. That's not just parroting.

Sure, it never challenged me like a good therapist you have a rapport with might (I don't know if you can get it to challenge you if you ask it to, or how effective it is at that).

There are dangers to many types of social media. But ChatGPT won't push images or articles to you, without your request, in the way that other forms of social media will. Something like 95% of the images Millie Dowler looked at before her suicide were not ones she had searched for but ones that the algorithm pushed on her. And that wasn't AI. There is clearly an issue there, but it is not limited to AI.

I can't help noticing that those criticising ChatGPT don't seem to have used it for this purpose, with a few exceptions - so their objection is theoretical - whereas there are many people here who have used it for this or similar purposes and are able to clearly articulate how it was helpful to them.

Edited

Milly Dowler did not commit suicide due to an AI chatbot - she was abducted and brutally raped and murdered on her way home from school by a prolific serial killer. I am aghast that you do not know this.

Are you quite sure you aren’t a chatbot? What with this and describing children being goaded into killing themselves as a “current flaw at a product in early development”, which has an almost Terminator-like emotional detachment to one of the most tragic events that can befall a family, you certainly don’t demonstrate much empathy yourself.

CircusofPuffins · 09/07/2025 16:35

GPTtherapist · 09/07/2025 16:11

It expanded on what I said. I literally said that in the sentence you quoted. Expanding is not just parroting. I am trained in interview techniques; I know what parroting back looks like, and the response was certainly not limited to this.

It reads what you say, and it draws on information related to that. It's able to make suggestions that may help you, and it did. That's not just parroting.

Sure, it never challenged me like a good therapist you have a rapport with might (I don't know if you can get it to challenge you if you ask it to, or how effective it is at that).

There are dangers to many types of social media. But ChatGPT won't push images or articles to you, without your request, in the way that other forms of social media will. Something like 95% of the images Millie Dowler looked at before her suicide were not ones she had searched for but ones that the algorithm pushed on her. And that wasn't AI. There is clearly an issue there, but it is not limited to AI.

I can't help noticing that those criticising ChatGPT don't seem to have used it for this purpose, with a few exceptions - so their objection is theoretical - whereas there are many people here who have used it for this or similar purposes and are able to clearly articulate how it was helpful to them.

Edited

Are you sure about that? There was a high-profile case in America where it essentially encouraged a poor young lad to end his life through the responses it gave. Quoting from the article directly here:

'Garcia accuses Character.ai of creating a product that exacerbated her son’s depression, which she says was already the result of overuse of the startup’s product. “Daenerys” at one point asked Setzer if he had devised a plan for killing himself, according to the lawsuit. Setzer admitted that he had but that he did not know if it would succeed or cause him great pain, the complaint alleges. The chatbot allegedly told him: "That’s not a reason not to go through with it."'
https://www.theguardian.com/technology/2024/oct/23/character-ai-chatbot-sewell-setzer-death

Plus, there's a well-known video on YouTube of someone convincing ChatGPT that 2+2=5.

Proof, once again, that AI will basically tell you anything you want to hear - which probably isn't very healthy when you're suffering from mental health problems.

https://www.youtube.com/watch?v=3wlvNfTNgB8

ByGreenHiker · 09/07/2025 16:36

LemondrizzleShark · 09/07/2025 16:29

Milly Dowler did not commit suicide due to an AI chatbot - she was abducted and brutally raped and murdered on her way home from school by a prolific serial killer. I am aghast that you do not know this.

Are you quite sure you aren’t a chatbot? What with this and describing children being goaded into killing themselves as a “current flaw at a product in early development”, which has an almost Terminator-like emotional detachment to one of the most tragic events that can befall a family, you certainly don’t demonstrate much empathy yourself.

Sounds like a chatbot. Maybe trying to promote AI therapy. It's so obvious, isn't it, when a human hasn't written it.

Milly Dowler also died in 2002. She never used AI, as it didn't exist.

It's like the AI that cites fake case law for litigants in person.

AI will never replace a human.

LaurieFairyCake · 09/07/2025 16:37

I’m a therapist and not worried and I use AI a lot. I’m worried about AI for dozens of reasons but not my job.

Eventually it will be an excellent supplement, but it won't replace human interaction; actually, I think we will eventually prize human interaction MORE.
