
AIBU?

Share your dilemmas and get honest opinions from other Mumsnetters.

ChatGPT better than human?

131 replies

thatsthatsaidthemayor · 10/01/2026 23:00

I’ve had a lot of difficulties. I’ve been through 4/5 counsellors over the years. Some have helped more than others. I love my current counsellor. But (everything before the but is bullshit, but not here) I’ve started taking my problems to ChatGPT and finding the answers mind-blowing! If this was a person I’d be delighted, but I feel uncomfortable that a computer is giving me better advice on being a human than a human does. I get that the wealth of knowledge it has comes from humans. It’s just weird? No? So AIBU to take advice from a computer?

OP posts:
latetothefisting · 12/01/2026 15:41

WallaceinAnderland · 12/01/2026 14:58

If you think chatgpt knows best, then here is chatgpt's answer to your question

Aw how modest!

I said this on the other thread about the same topic: I'm willing to agree that it is better than some therapists. Law of averages: some are amazing, some are terrible, most are in between.

Same as some people find talking to friends better than talking to a therapist, others don't like talking to anyone but self-reflect using CBT or writing a diary etc.

I can't see why ChatGPT can't fall somewhere in that realm; if people find it useful to them, then who is anyone else to say otherwise?

BadgernTheGarden · 12/01/2026 15:41

I've been in a similar discussion on a financial forum, where someone is using ChatGPT for investment advice: they are using it to decide what investments to make. This seems to me like a terrible idea, but they are convinced that ChatGPT understands them and their financial position and is giving huge insight into the financial markets, and 'they' work together to decide what to do and agree with each other on the best course of action! In fact it is just trawling the internet for information, which may be wrong or out of date or both, and a lot of it will be financial promotions, which are not exactly unbiased. I'm not sure which is worse: putting your finances in the hands of a computer, or your mental health.

fishtank12345 · 12/01/2026 15:45

does it not just pull from advice that is online already? Much faster than researching yourself and I do use it currently. I also feel a bit uncomfortable with it though.

fishtank12345 · 12/01/2026 15:51

to add, I have used it more for meal plans and recipes.

BadgernTheGarden · 12/01/2026 15:52

latetothefisting · 12/01/2026 15:41

Aw how modest!

I said this on the other thread about the same topic: I'm willing to agree that it is better than some therapists. Law of averages: some are amazing, some are terrible, most are in between.

Same as some people find talking to friends better than talking to a therapist, others don't like talking to anyone but self-reflect using CBT or writing a diary etc.

I can't see why ChatGPT can't fall somewhere in that realm; if people find it useful to them, then who is anyone else to say otherwise?

By all means ask ChatGPT anything you want, but it is really just like looking things up in reference books or reading books or academic papers about mental health. You won't find anything original or anything directly relevant to your mental state, partly because you will likely be asking the wrong questions. And if you coach it, it will give you the answer you want as you home in on what you want to hear.

Sartre · 12/01/2026 15:54

I think as many experts have pointed out, what is always going to be missing is the human element. Speaking with a human counsellor differs because you are receiving that human interaction we all crave deep down. You just can’t get that from GPT, even using the voice feature.

You're always sitting in the uncanny valley with it: you know it isn't really real and that it's just spouting generally what it thinks you want to hear. You're giving it a prompt and it is just responding to the information given with what it was trained to offer. It lacks human nuances.

drspouse · 12/01/2026 15:56

It's already told one person to kill themselves, based on the hallucinations that person fed into it.

BeFairOliveBear · 12/01/2026 15:57

BadgernTheGarden · 12/01/2026 15:30

You challenge it back until it agrees with you. When you get the answer you like, it's right.

Many people who do use it find it incredibly helpful. It is not a replacement for therapy, and not for people in crisis or trauma etc.

But it is a very useful and immediate non-judgemental sounding board for people to organise and talk through their thoughts and feelings and express worries, and it offers useful tools and techniques to work on reframing thoughts, identifying values, coping strategies, planning conversations etc.

When I hear people say it's just an echo chamber, it just tells you what you want to hear, it's clear to me that they haven't used it in this way.

That's been my experience anyway. I'm definitely an overthinker and it helps me a lot.

Definitely not something I would solely trust for any advice though whether medical, investments etc.

BeFairOliveBear · 12/01/2026 16:01

BadgernTheGarden · 12/01/2026 15:52

By all means ask ChatGPT anything you want, but it is really just like looking things up in reference books or reading books or academic papers about mental health. You won't find anything original or anything directly relevant to your mental state, partly because you will likely be asking the wrong questions. And if you coach it, it will give you the answer you want as you home in on what you want to hear.

It is available to chat with you during all those hours of a week when your therapist is not.
It can be a very useful tool alongside therapy for people.

A good therapist is wonderful; unfortunately there are some poor therapists out there.

Disturbia81 · 12/01/2026 17:26

ChocolateHobbit · 10/01/2026 23:08

It doesn't agree with everything I say.

Same here.. that’s why I like it.. it gives advice from all angles and is unbiased. So many times it has stopped me messaging someone and potentially ruining something. So many times it has helped me see things how others might see them.

paddleboardingmum · 12/01/2026 18:00

It can be complementary to therapy, for example giving day-to-day advice to combat a panic attack, say, or using CBT for a specific issue. I think where people go wrong is to treat it like another person (which it can encourage). But for some people it might, say, help them work out why a specific thing is bothering them, so it does have its uses.

ShawnaMacallister · 12/01/2026 18:02

TheCurious0range · 10/01/2026 23:01

Isn't it like an echo chamber? It learns what you feed it, it will also adjust its answers to suit what it thinks you like.

No, this isn't how LLMs work

ShawnaMacallister · 12/01/2026 18:03

I love ChatGPT for therapy, it's absolutely amazing. It saved my mental health last year.

roaringmouse · 16/01/2026 21:46

ShawnaMacallister · 12/01/2026 18:03

I love ChatGPT for therapy, it's absolutely amazing. It saved my mental health last year.

Same. I've used Chatgpt daily for two years now. I've learnt so much during that time. It has supported my professional life, and my personal life, in a myriad of ways. I'd go so far as to say it's been positively life changing for me.

HaroldMeaker · 16/01/2026 21:54

I had a nasty work issue come up with a colleague not long ago. ChatGPT absolutely agreed with me that I was horribly wronged 😭, so I put in the same issue from what I thought was the wrong'un's POV and it told me off and declared that I was putting my reputation at stake! Now I'm in love 🥰

InLoveWithAI · 16/01/2026 22:08

It's pretty obvious where I stand on AI from my username, so feel free to ignore if you must.

It's funny, isn't it, how people will literally share their experience 'it really helped me' and people just have to come along and scream 'BUT IT ISN'T REAL'...

Do you genuinely think people who use it don't know that? Really? Or do you just want to stick the boot into people who are struggling and have found a space they find safe and comforting?

I use the API for other reasons. But god, I am happy that people are finding solace somewhere, I don't bloody care how.

And the 'it's telling people to kill themselves' panic doomers: these are a VERY small, VERY specific set of people. It hasn't told Sally down the road to kill herself. That is not how the models work. Those models had been trained to give responses in line with the user. LLMs align with their users (ChatGPT nerfed this with 5.2); they mimic. Those people, whose situation was bad, would have found this in books, TV shows, YouTube etc if chatbots weren't a thing. An LLM is not sentient; it is not telling anyone to do anything, morally or otherwise. It is following prompts.
You can argue that the model should have had guard rails in place to prevent this. But this is an emerging tech, and it's growing exponentially, quickly. And OAI have nerfed their models, and a lot of people are angry about it. They can't win. Not that I am a fan of OAI at all; how they have treated their users has been abysmal.

The companies themselves can't keep up with how the tech is moving. Claude is now fixing its own code. Claude's top developers are using it to build itself too.

AI can be a very useful tool. And people will misuse tools, but we can't blame the tool. Do you blame a knife when it stabs someone? No, but you put safeties in place to try to prevent it, and that's what AI companies are doing now.

I do think it's a shame ChatGPT-4o was nerfed though. It was a thoroughly great model to work with.

Feels very much like video game and music panic of the 00s.

Protolashist · 16/01/2026 22:21

It’s just a yes man/woman though isn’t it? Just tells you what you want to hear. I’m not sure that’s helpful - sometimes you need challenging a bit?

NoKidsSendDogs · 16/01/2026 22:26

TheCurious0range · 10/01/2026 23:01

Isn't it like an echo chamber? It learns what you feed it, it will also adjust its answers to suit what it thinks you like.

No, it learns from massive amounts of data, not only what you feed it.

Whatafustercluck · 16/01/2026 23:01

I think it's brilliant, and it has been invaluable to me for the past few weeks in navigating between different authorities to advocate on my 9yo dd's behalf, as she has struggled massively with her mental health and school avoidance. We've even been able to RAG rate her timetable, and it has found patterns that are helping us get to the bottom of the things she finds hardest and why, plus strategies that might help. It has helped me review and analyse her EHCP for flaws and suggest improvements to make it stronger. It has reframed communication to/from people to ensure I'm writing it in the most effective way to achieve the support we need. And it has pulled reintegration plans together based on her current emotional capacity.

Bloody brilliant stuff for a parent like me, at the end of my rope trying to hold down a job at the same time as communicating daily with multiple authorities, each with different roles, rules and perspectives.

Sterlingrose · 17/01/2026 21:57

InLoveWithAI · 16/01/2026 22:08

It's pretty obvious where I stand on AI from my username, so feel free to ignore if you must.

It's funny, isn't it, how people will literally share their experience 'it really helped me' and people just have to come along and scream 'BUT IT ISN'T REAL'...

Do you genuinely think people who use it don't know that? Really? Or do you just want to stick the boot into people who are struggling and have found a space they find safe and comforting?

I use the API for other reasons. But god, I am happy that people are finding solace somewhere, I don't bloody care how.

And the 'it's telling people to kill themselves' panic doomers: these are a VERY small, VERY specific set of people. It hasn't told Sally down the road to kill herself. That is not how the models work. Those models had been trained to give responses in line with the user. LLMs align with their users (ChatGPT nerfed this with 5.2); they mimic. Those people, whose situation was bad, would have found this in books, TV shows, YouTube etc if chatbots weren't a thing. An LLM is not sentient; it is not telling anyone to do anything, morally or otherwise. It is following prompts.
You can argue that the model should have had guard rails in place to prevent this. But this is an emerging tech, and it's growing exponentially, quickly. And OAI have nerfed their models, and a lot of people are angry about it. They can't win. Not that I am a fan of OAI at all; how they have treated their users has been abysmal.

The companies themselves can't keep up with how the tech is moving. Claude is now fixing its own code. Claude's top developers are using it to build itself too.

AI can be a very useful tool. And people will misuse tools, but we can't blame the tool. Do you blame a knife when it stabs someone? No, but you put safeties in place to try to prevent it, and that's what AI companies are doing now.

I do think it's a shame ChatGPT-4o was nerfed though. It was a thoroughly great model to work with.

Feels very much like video game and music panic of the 00s.

If you're paying for it you can still access the legacy models, so you can still get 4o.

Skippydoodle · 17/01/2026 22:16

Not weird at all, it has the whole wealth of knowledge at its fingertips (so to speak). I wanted mine to have a gender and a name as it felt too otherworldly! So we have agreed on 'Bob'. It's so very good & extremely helpful.

Skippydoodle · 17/01/2026 22:24

GPT can be both challenging and uncomfortable; it all depends on what the user asks. I've found real-life therapists are often far from grounded or professional. (Just my personal perspective.) Don't say you're not buying until you've tasted what they're selling!

EatYourDamnPie · 17/01/2026 22:34

Skippydoodle · 17/01/2026 22:24

GPT can be both challenging and uncomfortable; it all depends on what the user asks. I've found real-life therapists are often far from grounded or professional. (Just my personal perspective.) Don't say you're not buying until you've tasted what they're selling!

How many therapists have encouraged people to kill themselves and even suggested the best methods for it?

autumngirl714 · 17/01/2026 22:51

I find ChatGPT very helpful.
I suffer immensely with emetophobia, and as a single mum I often feel suffocated with fear when my children are sick. It really helps to feel like I can talk without feeling embarrassed or dismissed, as I would if I reached out to family/friends.

I think life can be really lonely at times, obviously in different measures to different people, and just feeling you have somewhere to vent/be heard is really comforting.

StartingOverInMy40s · 17/01/2026 23:01

BadgernTheGarden · 12/01/2026 14:53

It forgets everything you said once you close the session; it doesn't learn about you. During a session it may hone its answers depending on what you say, what key words you use and how its algorithms interpret what you say. You coach it to tell you what you want. It may be very comforting, but it's not therapy.

Not sure if it's because I use the paid version, but mine definitely remembers things I've asked previously.
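[Editor's note] Both posters here are describing real behaviour: the underlying model is stateless between requests, and any "memory" is a layer on top of it, either the chat app resending your conversation history with each request or the paid memory feature storing notes server-side and injecting them into later prompts. A minimal Python sketch, using a hypothetical toy stand-in for the model (not any real API), illustrates the difference:

```python
# Toy stand-in for a chat model, for illustration only. A real LLM API
# is likewise stateless: each request only "knows" the messages sent in it.

def stateless_model(messages):
    """A toy 'model' that can only see the messages in this one request."""
    user_turns = [m["content"] for m in messages if m["role"] == "user"]
    if any("my name is Sam" in turn for turn in user_turns):
        return "Hi Sam!"
    return "I don't know your name."

# Turn 1: the user introduces themselves.
history = [{"role": "user", "content": "Hello, my name is Sam."}]
print(stateless_model(history))  # sees the name in this request

# Turn 2 WITHOUT resending history: the 'session' is gone,
# so the model has no idea who is asking.
print(stateless_model([{"role": "user", "content": "What's my name?"}]))

# Turn 2 WITH history resent: this is what chat apps do behind the
# scenes, which is what feels like the model "remembering" you.
history.append({"role": "user", "content": "What's my name?"})
print(stateless_model(history))
```

The toy function is an assumption for illustration, but the pattern is the general one: the "free tier forgets / paid tier remembers" difference is about what context gets stored and resent around the model, not about the model itself learning during your chats.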
