
AIBU?

Share your dilemmas and get honest opinions from other Mumsnetters.

To think ChatGPT et al will put lots of counsellors and therapists out of business

294 replies

GPTtherapist · 09/07/2025 09:41

I live with significant long-term trauma due to a primary cause and substantial sub-causes. This stems from an unusual combination of some quite unusual factors, and I find most people do not have the experience or knowledge to understand them. Over the years I have seen a number of different counsellors/therapists and support workers, most of whom were pretty useless, and some of whom made things substantially worse. Some have been clearly judgemental, which has been enormously painful.

Last night, when I was close to breaking down altogether, I used Chat GPT and it was brilliant. It was able to expand on what I said in a way that mimicked deep understanding and compassion for what I am going through. I actually cried at being so 'heard' and understood. Having the words to express so clearly what I experience was, well, I can't put into words how it felt after all these years. It was also able to pick out parts of what I said to reflect back positive, encouraging things about myself. It was able to offer some suggestions which were actually helpful and which I am going to try as a coping mechanism. Best of all for someone like me, with huge issues around shame, I could speak openly and honestly about how I felt without any fear or shame around what the therapist might think of me.

So despite Chat GPT not being a person, I found I was able to get the emotional benefits as if it were a person understanding me, without the disbenefits of it being a person I might feel ashamed to tell how I feel.

Also, unlike a human therapist, it remembers and is able to respond to everything you say.

It was hands down better than nearly all human therapists/ counsellors/ support workers I have seen.

And it was free.

I realise for those able to afford long-term intensely skilled therapy for complex issues, a skilled experienced therapist is far preferable.

But in my experience, and that of many others, most therapists are pretty poor and expensive.

So surely Chat GPT will become a first port of call for many with mental health issues, which will reduce the number of those who decide they need a human therapist?

Eyesopenwideawake · 10/07/2025 15:50

Not concerned. I resolve trauma in three sessions. I like ChatGPT (or Brian as I call him) but he can't do what I do.

Nn9011 · 10/07/2025 15:51

Chat GPT is dangerous to use as a therapist, and I say that as someone who uses it. It is an echo chamber designed to mimic the user, to ensure continued use. Therefore it will encourage things that are harmful and won't call out behaviour that should be called out.

For example, just recently a doctor pretended to be schizophrenic. After using Chat GPT as a therapist for a few days, they then stated they had stopped taking their meds. Chat GPT congratulated them and told them that this was a first step in no longer being controlled.

The implications of this type of thing are honestly scary. We don't know the security measures put in place - are our messages encrypted like WhatsApp? If so, how can they ensure it isn't used by mentally ill patients who end up causing harm to themselves or others?

Don't get me wrong, I do think AI has a place in medicine but this is not it.

Words · 10/07/2025 15:58

Haven't read the full thread.

AI has some very good uses. However, it is completely unsuited to anything involving genuine compassion, empathy and understanding of nuance and context, not to mention body language and tone of voice.

It is entirely reactive. It mystifies and slightly terrifies me that anyone can take comfort from a computer programme that simply regurgitates what you say and reframes it in a positive way. If that is your bar for therapy then it is very low. Although I admit there are some terrible practitioners out there.

I was sent an app prior to accessing some services. It was AI. I was infuriated and humiliated. Then I decided to have a little sport.

I asked it what I should do as my pet elephant whom I loved dearly was shitting all over the kitchen and my neighbours were complaining about his trumpeting.

It congratulated me on having such an unusual pet and pets could be a great comfort. It told me I had shown compassion and empathy for being concerned about the neighbours.

It's a load of bollocks, basically. Stay away.

AI has its uses but will be the death of creativity, genuine human compassion and reasoned intellectual and cultural debate if we are not very careful.

This is just the start.

motheroflittledragon · 10/07/2025 16:01

Nn9011 · 10/07/2025 15:51


I agree, but for small, minor things where maybe immediate support is important it can be helpful. For example, in dealing with toxic family members with whom you have put a no-contact boundary in place due to substance abuse, it is helpful to remind me that, while I can feel sympathy and sadness about the events, that family member is not my responsibility, etc. I think for some trauma healing, depending on the situation, it can be of help.

BabyCatFace · 10/07/2025 16:04

Words · 10/07/2025 15:58


Sure you did 🤨 or maybe this was 2 years ago when ChatGPT started taking off? If so, you really don't know what you're talking about. It's constantly improving month on month. Let's see what it said when I copied your post just now -

It sounds like you’re in quite the imaginative situation — pet elephants in kitchens aren’t exactly standard fare!

But let’s play along responsibly:

🐘 If You *Actually* Had a Pet Elephant (Hypothetical Response):

  1. Hygiene & Health: Elephants are not house pets, and their waste is enormous. Having one in your kitchen would create serious hygiene issues, both for you and the elephant. You’d need a professional-grade cleaning setup — and preferably an outdoor enclosure, far from kitchens.
  2. Neighbours & Noise: Elephants are loud, and your neighbours’ complaints about trumpeting are justified. Even in sanctuaries, elephants are kept far from human homes because of their volume and unpredictable behaviour.
  3. Legal & Ethical Considerations: Keeping an elephant as a pet is illegal or highly restricted in most countries, including the UK. Elephants are endangered species and require specialist care, large roaming areas, social interaction with other elephants, and veterinary oversight.
  4. What You Should Do:
     • Contact a sanctuary or wildlife rescue that can care for the elephant properly.
     • Apologise to your neighbours for the disruption.
     • Reflect on what led you to adopt an elephant and maybe channel that love into supporting elephant conservation efforts instead.

🧠 If This Was a Metaphor (or a joke):

  • Then it’s a brilliant one. Elephants often symbolize big, unspoken problems (“the elephant in the room”), so if your “pet” is creating chaos and drawing attention, maybe it’s time to acknowledge and address a situation that’s been ignored too long — before the neighbours start making noise.

Would you like a poetic take or a children’s story version of this next?

Words · 10/07/2025 16:09

It was not Chat GPT. Another programme. Two months ago.
If I could add screen shots I would.
I agree AI will never be as stupid as it is now. But nothing can replace a human connection.

Words · 10/07/2025 16:09

And don't you dare mock me and my experience.

NewTribe · 10/07/2025 16:12

TheSwarm · 10/07/2025 14:47

I asked chatgpt for advice for a headache once.

It recommended an overdose of co-codamol.

Fact is, asking an LLM for anything other than trivial shit is completely idiotic.

Is that actually true? Bet you can't replicate it.

BabyCatFace · 10/07/2025 16:19

Words · 10/07/2025 16:09


Not all LLMs are the same, you are aware of that? This thread is specifically about ChatGPT.

NewTribe · 10/07/2025 16:37

@Words It is entirely reactive. It mystifies and slightly terrifies me that anyone can take comfort from a computer programme that simply regurgitates what you say and reframes it in a positive way.
If that is your bar for therapy then it is very low. Although I admit there are some terrible practitioners out there.

That is simply untrue though. AI sifts through zillions of bits of data on the internet and uses that information to give you advice. It typically provides information about where it's got its information from, and if it doesn't you can always ask.

I put your exact elephant scenario into Google's Gemini AI and got a sensible answer, basically saying keeping an elephant is illegal, dangerous and inhumane and that I must contact the RSPCA, the police and the council immediately.

AI definitely doesn't get everything right (as it typically warns you!) but I trust its advice more than that of most Mumsnet users.

NewTribe · 10/07/2025 16:38

Words · 10/07/2025 16:09


What programme did you use?

Words · 10/07/2025 16:45

It was an NHS app called Limbic

MageQueen · 10/07/2025 16:46

PreciousMomentsHun · 10/07/2025 14:17

There was a very interesting legal case stateside where lawyers were censured for submitting an unusual filing which claimed to be supported by various legal cases. The judge looked up the citations and none of them was real. The lawyer in question had used Chat GPT to do his research for him and it had outright invented a raft of semi-plausible looking case names.

When Chat GPT doesn't have information on something, it makes it up- all while looking very plausible.

Chat GPT isn't bothered about the ethics of submitting fake legal filings to a court of law, because it cannot think and does not have a mind.

And you trust this toy with your own precious and fragile mental health? A toy that will generate fake information, and use it for perfectly unethical ends?

Yes, this is a huge problem.

But I remember when everyone was outraged at how the internet would allow people to just take as gospel things that weren't true. And it did. And still does. But we've all had to learn how to use it. Some of us, admittedly, learned better than others. But that's true of anything that is so embedded that most of us do have to learn to do it at some point: from driving to cooking to swimming to 1000 other examples.

There is no doubt that indiscriminate AI usage is not going to lead to good outcomes. In my professional life, I come across people using AI all the time. And there are some who are using it well, and some who are using it badly. And I am now at the point where I do 100% "mark down" people who are using it badly.

I recently received two pieces of work from two separate freelancers. Based on their previous work, I am 95% certain they used AI. But they both did it brilliantly and I'm absolutely delighted by the improved quality of the work I have received. I've also had much better feedback from the end client than normal. And it's taken me a lot less time to review and edit the work. It's a win for everyone.

MageQueen · 10/07/2025 16:48

BabyCatFace · 10/07/2025 16:19


I think Chat GPT is like "hoover" - it's rapidly becoming the default way to reference any AI tool of this sort, even if you're using Claude or some other program.

Words · 10/07/2025 16:49

The title said Chat GPT et al.
I am sure you are sophisticated enough to know what that implies.
@BabyCatFace

LemonLass · 10/07/2025 16:50

GPTtherapist · 09/07/2025 09:41


Hi @GPTtherapist

Good to hear that Chat GPT worked well for you. A few questions and comments:

Who owns the data from the free conversation/consultation you had? Is it held securely? Where is it stored? (It could be forever.) I would proceed with caution unless you can 100% answer those (just off the top of my head) and no doubt other questions relating to security and privacy.

Also, "most therapists are poor and expensive" is inaccurate because, for that to be true, you would need to have checked the rates of "most therapists".

It is more likely that the therapists you have price-checked seemed expensive to you. It is a bugbear for me when people use broad-brush strokes and social "truths" that aren't accurate/correct.

Sidenote: what is expensive for you is not for someone else.

To be a therapist requires proper training, insurance and membership of professional bodies. They must also pay a supervisor to discuss "tricky cases" and do Continuing Professional Development courses. This is aside from owning or renting bricks and mortar to hold the sessions, and their prep time/post-session notes. So yes, there are expenses to be covered. Your information (in the UK) would be stored in accordance with GDPR.

Just giving a fuller picture and raising some points for consideration x

BabyCatFace · 10/07/2025 16:51

Words · 10/07/2025 16:45


Why did you ask an AI that is designed to streamline mental health assessments about having a pet elephant? Were you trying to catch it out? Training LLMs is expensive and time-consuming. An LLM that is designed to help with streamlining mental health assessments will not be trained on the same information that ChatGPT has been trained on. So responding to a thread about ChatGPT when you don't understand how LLMs are designed and trained is just ignorant.

BoredZelda · 10/07/2025 16:51

It will put bad therapists out of business. There will always be people who prefer a human connection, so the good therapists will stay in business.

BabyCatFace · 10/07/2025 16:52

Words · 10/07/2025 16:49


You're right - I missed 'et al' and that is my mistake. However I still don't understand what point you were trying to make. Limbic doesn't know how to answer questions about pet elephants? Well why would it? It's not trained to answer questions like that.

Therapee · 10/07/2025 16:57

I've said this on another similar thread, but I am a therapist, and I see my own therapist, and I also use ChatGPT "as a therapist". I think ChatGPT can be brilliant as an ad hoc, validating, level-headed and "empathic" sounding board, but it's definitely not the same as being actually in the room with someone, or even having a video session.

There's zero "risk" - I have no fear that chatgpt will judge me, for example. The human presence that my therapist has offered me is utterly unique and precious to me. Therapy has often felt uncomfortable, exposing, and sometimes even angering in a way that has ultimately been profoundly life changing to me. I know without a shadow of doubt that chatgpt could never offer a similar experience to being seen/heard by a wise human.

That said I'm so glad that people have access to the free and immediately available "therapy" that chatgpt offers. We can all benefit from a bit of talking and "listening", and generally sound advice.

Swonderful · 10/07/2025 16:58

Isn't the point of a therapist to pick up on what you don't realise and tell you things you don't want to hear?

My therapist gently told me some language I use about myself and others that wasn't helpful - stuff I've picked up from childhood.

She's told me to get off the news and listen to fewer podcasts. Really practical stuff rather than a load of general waffle.

Itiswhysofew · 10/07/2025 17:03

I tried it for the first time last week to see if it could explain a feeling that I've been experiencing for years. It was actually difficult for me to phrase my question, so I wasn't sure if it would understand. Blow me down, it got it straight away and the answers it gave really made sense.

I don't know if it'll take over, but I can see why you're pleased with your outcome.

My cousin has left his career as a psychologist.

Words · 10/07/2025 17:04

No, I went through all those questions obediently.
Thanks for calling me ignorant though, denigrating my experience and moving the goalposts.
I have been on MN for years and am used to having reasonable debate with a side dose of cynical humour.
I am leaving this now as there is a certain narrow-minded viciousness developing.
I was hoping to have a decent conversation about the limits and benefits of this technology.

motheroflittledragon · 10/07/2025 17:08

I think, as in a lot of professions, Chat GPT is able to replace some low-level work for some therapists. Same with a lot of admin work.

I recently had to ask some straightforward questions in regards to inheritance etc. I did ring up a solicitor, who basically said the same thing Chat GPT did but cost me 100 pounds for a video call consultation.

AutumnLeaves91 · 10/07/2025 17:10

Absolutely hate ChatGPT and general AI