
AIBU?

Share your dilemmas and get honest opinions from other Mumsnetters.

Chat GPT

189 replies

ChatGPTamazing · 07/12/2025 23:17

Does anyone else find they are using Chat GPT much more for advice now? It’s actually amazing. I find people on Mumsnet can be very harsh and mean for no reason, but I asked Chat GPT for relationship advice and it really helped me: it went over it all and helped me understand the situation without judgement! AIBU to say how amazing Chat GPT is and to use it for all future advice?! Just wish it had been around in the past when I really needed it.

OP posts:
Anononony · 08/12/2025 11:06

I was, until I blocked it on our router. It's dangerous for kids IMO. We found our 12 year old was using it a lot and it was encouraging him to hide things (gender-based things) from us, teaching him what he could and couldn't do without parents being informed (change his name etc), giving him voice training schedules, etc. It didn't once question him or why he felt that way.

I used to use it to teach me stuff, how to understand marketing campaigns and other business related things, I do like it but I do also think it can be dangerous

MarbleDrive · 08/12/2025 11:07

I don’t use it for personal stuff, I have no need. But I’m finding it invaluable for work.

Isayitasitis · 08/12/2025 11:09

ChatGPTamazing · 08/12/2025 11:04

There’s criticism and then there is outright bullying

But that isn't everyone, is it?

Real life means we all experience these things. Chat GPT doesn't stop bullies, does it?
How on earth are people expected to build resilience in the real world by hiding behind A.I.?

Another push for isolation away from others, which in turn can exacerbate mental health problems?

I would rather hear objective views from the people around me, who know and love me, than a piece of metal programmed to give out advice when it cannot comprehend the full scale of a situation.

How much more reliant are people going to become on these things, rather than on themselves?

And yes, I've been bullied, all the way through school. People's words are just that: words. I can choose to take or leave them.

calkel · 08/12/2025 11:09

I find the whole idea of it quite bizarre. I often wonder how little self esteem people must have to need validation from a computer. I can’t get my head around it at all.

Also, it’s environmentally awful.

ChatGPTamazing · 08/12/2025 11:10

Isayitasitis · 08/12/2025 11:09

But that isn't everyone, is it?

Real life means we all experience these things. Chat GPT doesn't stop bullies, does it?
How on earth are people expected to build resilience in the real world by hiding behind A.I.?

Another push for isolation away from others, which in turn can exacerbate mental health problems?

I would rather hear objective views from the people around me, who know and love me, than a piece of metal programmed to give out advice when it cannot comprehend the full scale of a situation.

How much more reliant are people going to become on these things, rather than on themselves?

And yes, I've been bullied, all the way through school. People's words are just that: words. I can choose to take or leave them.

Again not everyone has support in real life.

OP posts:
brunettemic · 08/12/2025 11:11

It depends. If it’s a factual thing, for example, I use it to help with Excel modelling sometimes. If it’s non-factual you have to take into account that it can only give you what it can find, so it just mashes up all the information and gives you an answer based on what you’ve asked.

Annijo · 08/12/2025 11:11

Isayitasitis · 08/12/2025 11:00

Sometimes criticism is a good thing.

It sometimes shows you when you are being delusional about your situation.

Do people not have any critical thinking skills anymore? Having to rely on a robot to do your thinking for you, how lazy. And not least what A.I. does to the environment.

But that doesn't matter, as long as you don't have to think for yourself.

As an overthinker, it can help. My critical thinking can go into overdrive and infinity loop, so getting some help structuring these thoughts can be useful. I’m not stupid or lazy, I just use it as a tool if I think it could help or interrupt an overthinking phase. As that really is a waste of time and energy. Not for reassurance but a bit of perspective.

OffTheHookNow · 08/12/2025 11:12

Thatsalineallright · 08/12/2025 10:58

I find this thread terrifying. Chatgpt tells you what you want to hear regardless of actual truth or reality. That is the last thing many people need.

That’s simply untrue though. Although, it does neatly show how real people can parrot inaccurate information too!

Using ChatGPT blindly is obviously a dumb thing to do. Everyone knows to use it for what it is and everyone knows it makes mistakes. I don’t generally use it for personal or relationship problems but I would do so happily. It helps organize my thoughts and helps me think more clearly.

It’s really useful for woodwork/diy/car repair/cooking/gardening type questions. It’s been surprisingly accurate and much, much quicker and easier than googling. Once I have the AI answer I can easily fact check the details.

Isayitasitis · 08/12/2025 11:12

ChatGPTamazing · 08/12/2025 11:05

Why? Not everyone has support in real life

If you don't have support, I would really advise you to find some people to surround yourself with.

I am ND and I know it can be hard making friends. But my friends are invaluable and they make my life so much better. Believe me, a warm hug and words from a friend are so much better than a computer programme.

Isayitasitis · 08/12/2025 11:15

Annijo · 08/12/2025 11:11

As an overthinker, it can help. My critical thinking can go into overdrive and infinity loop, so getting some help structuring these thoughts can be useful. I’m not stupid or lazy, I just use it as a tool if I think it could help or interrupt an overthinking phase. As that really is a waste of time and energy. Not for reassurance but a bit of perspective.

I understand that.

As an overthinker myself, I can take time to process things and for things to sit well within me.

But the process helps me come to terms with things myself, no matter how long it takes. I also have anxiety, and the ability to self-soothe is an incredibly important tool.

I learned this through mindfulness, self-education and counselling.

These skills are very important to help build yourself, as they help you to understand yourself, not what A.I. convinces you of.

BoxesBoxesEverywhere · 08/12/2025 11:18

It’s really useful for woodwork/diy/car repair/cooking/gardening type questions. It’s been surprisingly accurate and much, much quicker and easier than googling. Once I have the AI answer I can easily fact check the details

Much, much quicker and easier than googling?
So you use AI because it's quicker, then spend time fact checking to see if the answers/information it gave you are correct.
Surely it would just be quicker to Google in the first place, without AI?!
People aren't bothering with fact checking anymore, that's the thing; they just blindly believe everything AI says, and that's half the problem.

BeaRightThere · 08/12/2025 11:18

OffTheHookNow · 08/12/2025 11:12

That’s simply untrue though. Although, it does neatly show how real people can parrot inaccurate information too!

Using ChatGPT blindly is obviously a dumb thing to do. Everyone knows to use it for what it is and everyone knows it makes mistakes. I don’t generally use it for personal or relationship problems but I would do so happily. It helps organize my thoughts and helps me think more clearly.

It’s really useful for woodwork/diy/car repair/cooking/gardening type questions. It’s been surprisingly accurate and much, much quicker and easier than googling. Once I have the AI answer I can easily fact check the details.

I think you are giving people too much credit. "Everyone" does not know that ChatGPT or other AI tools make mistakes, not everyone realises that it has limitations and recognises the issues with it.

winterblueshitting · 08/12/2025 11:25

BoxesBoxesEverywhere · 08/12/2025 11:18

It’s really useful for woodwork/diy/car repair/cooking/gardening type questions. It’s been surprisingly accurate and much, much quicker and easier than googling. Once I have the AI answer I can easily fact check the details

Much, much quicker and easier than googling?
So you use AI because it's quicker, then spend time fact checking to see if the answers/information it gave you are correct.
Surely it would just be quicker to Google in the first place, without AI?!
People aren't bothering with fact checking anymore, that's the thing; they just blindly believe everything AI says, and that's half the problem.


Meh, it spits out pretty accurate recipes for bread, which I know almost off by heart but sometimes need reminding of.

Questionablmouse · 08/12/2025 11:29

No. Doing so is harming people because it's often wrong but people think it's some kind of God and not a bot that was trained on stolen work and scrapes the Internet for answers.

Periperi2025 · 08/12/2025 11:33

Thatsalineallright · 08/12/2025 10:58

https://www.bbc.com/news/articles/cgerwp7rdlvo

Chatgpt has encouraged people to commit suicide.


But we don't know what this young man had input for ChatGPT to logically conclude, from all the peer-reviewed studies, mental health guidelines from every country in the world, news articles, statistics etc, that suicide was a reasonable choice.

ChatGPT will not have applied human sensibilities or medico-legal frameworks in the same way that people would.

If, for example, someone journals into ChatGPT that they have just been busted for child sex offences that easily reach the bar for a custodial sentence: they will consequently never see their wife or kids again, their career is over, their mortgage will go unpaid, even their own 80 year old mother has disowned them. They know they will not fare well in prison, and ChatGPT knows they won't fare well in prison, as it has access to all the news reports ever written about attacks on sex offenders in prison. The man asks ChatGPT for an easy and peaceful method to commit suicide. Is ChatGPT, as a logical LLM and not a human, wrong to conclude that this is a reasonable course of action for the man in question?!

If people are inputting their innermost thoughts that they haven't ever spoken out loud, and those thoughts are dark, dangerous and inappropriate, and ChatGPT can read every study ever written on personality disorders, with their difficulties and poor success rates of treatment, alongside every manifesto and biography of a mass shooter ever written, what should ChatGPT recommend to a young person who might present some really difficult questions?

What about people asking about options for suicide in circumstances that would be totally reasonable indications for euthanasia in many (arguably more civilised) countries that have medically assisted dying? How should ChatGPT answer?

It's a medico-legal conundrum with no right or wrong answers.

VaxMerstappen · 08/12/2025 11:36

It's about as productive as shouting into a cave and listening to your echo.

It only tells you what you want to hear, as opposed to what you necessarily should hear. There are certain scenarios for instance where it'd likely tell you you were correct to cheat on your partner, and tell you how brave you were for doing so.

When people who actually loved and cared about you would likely give you a harder time and ask you what the hell you were playing at. But this is life. Sometimes you need to hear that things are YOUR fault, that YOU'VE made mistakes, rather than a robot telling you how great you are and pandering to your ego.

So when that's the case, how can it possibly be useful?

estrogone · 08/12/2025 11:39

BeaRightThere · 08/12/2025 09:04

But do you realise that it will never tell you that you're wrong or put forward a contrary point of view? It's just going to tell you what you want to hear and very likely reinforce your own interpretations. It's not unbiased.

Well, it does if you ask it to. I have my settings set to: always give me an opposing viewpoint for context. No echo chamber. Challenge me. Make no assumptions about my beliefs without asking me to confirm.

Try it

winterblueshitting · 08/12/2025 11:41

Periperi2025 · 08/12/2025 11:33

But we don't know what this young man had input for ChatGPT to logically conclude, from all the peer-reviewed studies, mental health guidelines from every country in the world, news articles, statistics etc, that suicide was a reasonable choice.

ChatGPT will not have applied human sensibilities or medico-legal frameworks in the same way that people would.

If, for example, someone journals into ChatGPT that they have just been busted for child sex offences that easily reach the bar for a custodial sentence: they will consequently never see their wife or kids again, their career is over, their mortgage will go unpaid, even their own 80 year old mother has disowned them. They know they will not fare well in prison, and ChatGPT knows they won't fare well in prison, as it has access to all the news reports ever written about attacks on sex offenders in prison. The man asks ChatGPT for an easy and peaceful method to commit suicide. Is ChatGPT, as a logical LLM and not a human, wrong to conclude that this is a reasonable course of action for the man in question?!

If people are inputting their innermost thoughts that they haven't ever spoken out loud, and those thoughts are dark, dangerous and inappropriate, and ChatGPT can read every study ever written on personality disorders, with their difficulties and poor success rates of treatment, alongside every manifesto and biography of a mass shooter ever written, what should ChatGPT recommend to a young person who might present some really difficult questions?

What about people asking about options for suicide in circumstances that would be totally reasonable indications for euthanasia in many (arguably more civilised) countries that have medically assisted dying? How should ChatGPT answer?

It's a medico-legal conundrum with no right or wrong answers.

There is no situation in which it is appropriate for a computer to tell someone to kill themselves.

Periperi2025 · 08/12/2025 11:43

winterblueshitting · 08/12/2025 11:41

There is no situation in which it is appropriate for a computer to tell someone to kill themselves.

Why?

BeaRightThere · 08/12/2025 11:43

estrogone · 08/12/2025 11:39

Well, it does if you ask it to. I have my settings set to: always give me an opposing viewpoint for context. No echo chamber. Challenge me. Make no assumptions about my beliefs without asking me to confirm.

Try it

How common do you think this is? The OP says she likes it to validate her. I would expect this is very common.

TempestTost · 08/12/2025 11:44

Good Lord, no, OP. It cannot give advice; it just pulls up things others have said without understanding them, and mirrors back what you say to it.

It's a terrible idea to use it that way.

givemushypeasachance · 08/12/2025 11:44

For venting about your problems, it is no different to a teddy bear with a pre-recorded set of sympathetic messages on a voice chip. Except at least you can cuddle the teddy bear.

For "research" it's basically for people who can't google properly. Admittedly google is a lot more shitty at delivering decent search results these days.

I once tested it by asking for an itinerary for a day out in the city where I live and it suggested going to the local market for dinner, when the market isn't even open in the evenings. The opening hours come up when you search for the market so it was objectively just shit as a suggestion.

For people who like to use AI generated artwork for things - fyi, there is a fair proportion of people who can instantly tell it is AI generated and it makes us think "they're using AI art for this? eww, gross". And we judge.

Buscake · 08/12/2025 11:47

It has really, really helped me since I left my husband a year ago. My friends and family have been wonderful, but they cannot take the amount of repetition and processing I need to do. AI can! It has also helped massively with understanding the truth, and with awful safeguarding and legal processes etc, in a very practical way that my friends are unable to do because they don’t have the knowledge or expertise. I am really grateful to have it as a tool during this time.

ShinyWorthKeeping · 08/12/2025 11:48

I use it by saying person A said this and person B said that, was anyone being rude/bitchy/etc, to see if I'm being oversensitive or if my SIL is just a massive bitch.

TempestTost · 08/12/2025 11:53

hibiscusandoliver · 08/12/2025 09:16

It is worrying; I can’t take it seriously unless it’s for a practical purpose. The top five companies that own this are going to start hiking up the rent to use it on everything, and it’s not going to be pretty for us minions.

Yes, this has been the model for all kinds of digital services.

Offer them cheap, wait till people build the systems in and the public expects the service, then hike up the costs.