
AIBU?

Share your dilemmas and get honest opinions from other Mumsnetters.

Chat GPT

189 replies

ChatGPTamazing · 07/12/2025 23:17

Does anyone else find they are using ChatGPT much more for advice now? It’s actually amazing. I find people on Mumsnet can be very harsh and mean for no reason, but I asked ChatGPT for relationship advice and it was amazing; it really helped me, went over it all, and helped me understand the situation without judgement! AIBU to say how amazing ChatGPT is and to use it for all future advice?! Just wish it had been around in the past when I really needed it.

OP posts:
FerrisWheelsandLilacs · 08/12/2025 09:29

OriginalUsername2 · 08/12/2025 01:32

I haven’t found this to be true. I used it extensively for a good few months, asking it all my stupid questions; it told me straight when I was wrong, in a kind way, like “Yes, it can seem that way, but…”

No judgements, insults or eyerolls, which is refreshing.

Just so you know, “yes, it can seem that way” is not a way to tell someone “straight” that they’re wrong. In fact, it leads you to believe that you were at least partly right.

Teenytwo · 08/12/2025 09:31

murasaki · 08/12/2025 00:12

I use it mostly for cooking; I give it some cupboard and freezer contents and it plans dinner. Fab stuff.

That’s genius!

DarkEyedSailor · 08/12/2025 09:33

I've never used it and I don't want to. I don't need to at work. If I want to work something out I write it down and think it through.

GoodQueenWenceslaus · 08/12/2025 09:34

I hate it when people use it for complex stuff that it just can't handle. At work, people occasionally use it before they come to us, so they think they have the answers already, but all too often they're simply wrong because ChatGPT has not taken something vital into account. Then we have to waste lots of time explaining exactly why it's wrong, and it can be quite an uphill struggle.

MigGirl · 08/12/2025 09:34

@Mischance you asked ChatGPT about its energy usage and you really think it gave you the correct answer?

It can be a useful tool for some things, but it's definitely not always right. It draws its information from the Internet, which is itself biased, as not everything is recorded on the Web. It's also very American-biased, as the information on the Web doesn't reflect actual real-life demographics, so you will never get a truly complete reflection of real life. Yes, it's learned a lot, but it's really just a large language model. It's not able to fact-check information and is biased by your Internet algorithms, just like news feeds are.

And building these huge servers to power it is causing issues with power, even in the US. I don't think it's something to be used lightly, and certainly not every day for regular things.

Unfortunately, technology is moving faster than governments can keep up. And I really wish Google hadn't made it the default for their search engine. Most of the things I Google, it either can't answer or gives the wrong information.

pandagirl93 · 08/12/2025 09:35

JudgeBread · 08/12/2025 08:58

So am I the only one who thinks this is really fucking sad and dystopian then?

Nope, I’m with you on this one!

MigGirl · 08/12/2025 09:36

JudgeBread · 08/12/2025 08:58

So am I the only one who thinks this is really fucking sad and dystopian then?

Nope I'm with you on this one.

soocool · 08/12/2025 09:37

I don't use it for quasi "human like" connection or advice. But it works for some people and that's great.

However, it is absolutely the best thing ever for planning a trip, itineraries, public transport options, the nearest medical centres and doctors and how to use them and the cost (if any), connections from place to place, things to see (that I may not have known about!) and so on. I have surrendered my soul and my time to AI for this. No more endless trawling of different sites to find out about a place.

Brilliant.

winterblueshitting · 08/12/2025 09:39

soocool · 08/12/2025 09:37

I don't use it for quasi "human like" connection or advice. But it works for some people and that's great.

However, it is absolutely the best thing ever for planning a trip, itineraries, public transport options, the nearest medical centres and doctors and how to use them and the cost (if any), connections from place to place, things to see (that I may not have known about!) and so on. I have surrendered my soul and my time to AI for this. No more endless trawling of different sites to find out about a place.

Brilliant.

The only thing is for trip planning I’ve found it doesn’t really consider actual timing

Mischance · 08/12/2025 09:40

MigGirl · 08/12/2025 09:34

@Mischance you asked ChatGPT about its energy usage and you really think it gave you the correct answer?

It can be a useful tool for some things, but it's definitely not always right. It draws its information from the Internet, which is itself biased, as not everything is recorded on the Web. It's also very American-biased, as the information on the Web doesn't reflect actual real-life demographics, so you will never get a truly complete reflection of real life. Yes, it's learned a lot, but it's really just a large language model. It's not able to fact-check information and is biased by your Internet algorithms, just like news feeds are.

And building these huge servers to power it is causing issues with power, even in the US. I don't think it's something to be used lightly, and certainly not every day for regular things.

Unfortunately, technology is moving faster than governments can keep up. And I really wish Google hadn't made it the default for their search engine. Most of the things I Google, it either can't answer or gives the wrong information.

I did include an exclamation mark when I said I would ask it about its resource usage! I assumed that people would realise the tongue-in-cheek element!

NemesisInferior · 08/12/2025 09:41

I've used it occasionally, but only for really factual things that it can answer quicker than a straightforward Google search.

You absolutely have to take everything LLMs say with a pinch of salt - like the absolute nonsense posted above re: energy usage - and I would never, ever interact with one in the same way as I would a human. Asking a chatbot for emotional advice and subsequently describing it as "kind" is just fucking madness and shows a fundamental misunderstanding of how these things actually work.

Periperi2025 · 08/12/2025 09:43

I use it as a starting point for most topics I search and advice I need; it gives me the correct terminology to get started and organises my thoughts.

I like it for bouncing around interior design ideas, for therapy and journalling (I was never good at journalling until ChatGPT came along), and recently I used it for creating a CPD plan and interview practice for a promotion I want to apply for (this was impressive; I wish I'd had it 20 years ago).

Mischance · 08/12/2025 09:43

I do think it is sad that people need to use it for emotional support, but I also recognise that in the absence of good mental health services it may be all that some can find. That is a sad indictment of the services.

But for practical things it can be really helpful as long as you recognise that it might not always be right and use it selectively. And ask for citations of research.

Like any other tool it can be used wisely or unwisely. It is second to none for the speedy design of a poster for a coffee morning!

shivermetimbers77 · 08/12/2025 09:45

I find it helpful but I also worry that every time we ‘feed it’ our thoughts/feelings/innermost insecurities, it is learning more about us and will become ideally placed to manipulate us. I would love to believe it is benign and a force for good, but history makes me very cautious.

Wyksister · 08/12/2025 09:46

I’m not sure that it’s the healthiest way of getting “another” view, since it largely mirrors what you’ve put in.

You’ll end up with a very biased opinion.

I saw a social media post recently of a dad saying that his son had used it and it had encouraged him to commit suicide. It was heartbreaking.

Periperi2025 · 08/12/2025 09:46

Mischance · 08/12/2025 09:43

I do think it is sad that people need to use it for emotional support, but I also recognise that in the absence of good mental health services it may be all that some can find. That is a sad indictment of the services.

But for practical things it can be really helpful as long as you recognise that it might not always be right and use it selectively. And ask for citations of research.

Like any other tool it can be used wisely or unwisely. It is second to none for the speedy design of a poster for a coffee morning!

Why is it any sadder than using a blank notebook for emotional support in the form of journalling, or talking to an anonymous person on a phone line who will be working to an SOP or a script of some sort, and who doesn't really want to be at work talking to them on their third night shift in a row when their much-loved pet dog is poorly at home and the kids are hyped up and sensitive in the run-up to Christmas, etc.?

VikaOlson · 08/12/2025 09:47

ChatGPT just predicts what a typical answer to your query might be based on previous answers to similar questions it has found on the internet.
It's no good for factual things either, as it just gives you what an answer looks like, so will happily invent dates, laws and studies if it needs to.

Ansjovis · 08/12/2025 09:48

I'm using it for running advice and so far it has been spot on. I know it's not telling me what I want to hear as I made a typo when I was telling it one of my goal paces and it immediately responded with concern as that goal pace was too fast for the evidence I had given it so far.

EyeLevelStick · 08/12/2025 09:50

Zanatdy · 08/12/2025 05:55

Yes, I think I have diabetes, and it was brilliant last night when I had my first hypo. I use it most days now; so much better than Google searches.

Cripes. You think you have undiagnosed diabetes (so presumably not on insulin) but you are having hypos?

What did ChatGPT advise?

Periperi2025 · 08/12/2025 09:50

EyeLevelStick · 08/12/2025 09:50

Cripes. You think you have undiagnosed diabetes (so presumably not on insulin) but you are having hypos?

What did ChatGPT advise?

Eek...So much wrong here!!

Annijo · 08/12/2025 09:51

winterblueshitting · 08/12/2025 09:39

The only thing is for trip planning I’ve found it doesn’t really consider actual timing

I have also found this. It’s good for ideas but you do have to check separately. It is also very affirmative and so if you throw ideas and questions in, it will say it sounds fantastic.

JetFlight · 08/12/2025 09:51

I was using it for advice about getting organised in one chat and advice about a stray cat in other. I accidentally posted something about the cat in the chat about me. The confusing response of “yes the snuggle safe will be very grounding and warming for you” made me realise that ChatGPT just validates whatever you’re thinking.

winterblueshitting · 08/12/2025 09:52

Annijo · 08/12/2025 09:51

I have also found this. It’s good for ideas but you do have to check separately. It is also very affirmative and so if you throw ideas and questions in, it will say it sounds fantastic.

Yeah, for fun I planned a trip from the UK to Paris, then Nice, Ventimiglia and then Milan, and it said I could do it all in seven days 😂

twiddlingthumbs69 · 08/12/2025 09:55

I’ve just asked it if it uses confirmation bias. This is the answer:

Short answer: No — I don’t “use” confirmation bias.

Long answer (straight, evidence-based, no fluff):

  1. What I am
I’m a statistical model that predicts the most likely next text based on patterns in data. I don’t have beliefs, motives, or a psychology — so I don’t have cognitive biases in the human sense.
  2. Where bias can appear
Large language models can show biased patterns if the data they were trained on contains them. That isn’t “confirmation bias”; it’s training-data bias. But I’m designed to avoid reinforcing a user’s assumption and instead check it against evidence.
  3. What I actually do
When you ask something that has a factual answer, I’m required to base it on verifiable information (and for certain categories, to check with search). That’s the opposite of confirmation bias; it’s a safeguard against it.
  4. What I do not do
  • I don’t assume what you want to hear.
  • I don’t selectively pick info that matches your belief.
  • I don’t “choose” information — I follow rules about verification, safety, and evidence.

Redhairandhottubs · 08/12/2025 09:56

BeaRightThere · 08/12/2025 09:04

But do you realise that it will never tell you that you're wrong or put forward a contrary point of view? It's just going to tell you what you want to hear and very likely reinforce your own interpretations. It's not unbiased.

It does tell you if you’re wrong. I asked it once about a dispute I was having with my partner. At the same time as validating how I was feeling, it also pointed out how some of the things I had said could have made her feel. It made me look at things from a completely different perspective.