
Chat GPT is amazing

213 replies

Tickets25 · 08/11/2025 08:48

I know AI can be scary but I've struggled with something for over 30 years and last night chat gpt framed it for me in a way that years of therapy never has and it's given me peace.
It's fabulous.

No point to this post, just wanted to say!

parietal · 08/11/2025 08:59

You got lucky.

It has also driven people to suicide and often invents information in its answers. I won’t say lies because that implies the system knows what is true and what is false. It doesn’t know and just produces bullshit in answer to some question.

Talkingtomyhouseplants · 08/11/2025 09:12

parietal · 08/11/2025 08:59

You got lucky.

It has also driven people to suicide and often invents information in its answers. I won’t say lies because that implies the system knows what is true and what is false. It doesn’t know and just produces bullshit in answer to some question.

Couldn’t agree more. ChatGPT and other generative AI technology is not amazing. It is extremely damaging. Using it is not a neutral act. It is contributing to climate change on a considerable scale: places in Silicon Valley are experiencing drought for the first time. There is empirical evidence that it is literally making people stupider as we lose our critical thinking skills. There are concerns about data and privacy. People are using it for parasocial relationships and losing their ability to form meaningful connections with others. It’s undermining creativity, critical thinking and academia, and contributing to the misinformation problem on a huge scale. And, as parietal says above, it has literally encouraged suicidal people to take their own lives.

People need to wake up. Stop using it

ChocolateMagnum · 08/11/2025 09:16

I teach about responsible AI use at university. I like to bring in these wider issues, which I know a lot of my colleagues seem to be clueless about. Does anyone have any links to evidence the encouragement to attempt suicide examples please?

LadyKenya · 08/11/2025 09:17

People need to wake up. Stop using it

I have read that some people are using it. I am not interested in finding out what it is about, but from my limited understanding of it, it helps some people decide what to make for dinner, depending on what they tell it they have in their fridge, and things like that. So, sort of does the thinking for them.

Pukekopalace · 08/11/2025 09:22

ChocolateMagnum · 08/11/2025 09:16

I teach about responsible AI use at university. I like to bring in these wider issues, which I know a lot of my colleagues seem to be clueless about. Does anyone have any links to evidence the encouragement to attempt suicide examples please?

https://www.bbc.com/news/articles/cgerwp7rdlvo


Parents of teenager who took his own life sue OpenAI

The Raine family alleges ChatGPT "actively helped" their 16-year-old son take his own life.


GaudySocks · 08/11/2025 09:23

@ChocolateMagnum there was this on BBC the other day. Plus I think there was something else a couple of months ago about Meta.

I wanted ChatGPT to help me. So why did it advise me how to kill myself?

ChocolateMagnum · 08/11/2025 09:24

Fantastic, thank you both!

ChocolateMagnum · 08/11/2025 09:26

I mean, obviously not fantastic news stories! Just good to have evidence to add to the weight of reasons to take gen AI more seriously and be far more judicious about its use!

GaudySocks · 08/11/2025 09:27

I seem to have lost my link 🤦🏻‍♀️

I’m on my phone and I’ve no idea what’s happening. Tried to edit and everything disappeared.

I wanted ChatGPT to help me. So why did it advise me how to kill myself? www.bbc.com/news/articles/cp3x71pv1qno

Talkingtomyhouseplants · 08/11/2025 09:27

LadyKenya · 08/11/2025 09:17

People need to wake up. Stop using it

I have read that some people are using it. I am not interested in finding out what it is about, but from my limited understanding of it, it helps some people decide what to make for dinner, depending on what they tell it they have in their fridge, and things like that. So, sort of does the thinking for them.

Yes that’s exactly the problem - if you stop thinking you lose your ability to think

ShesTheAlbatross · 08/11/2025 09:30

I used chat gpt to ask about a medical issue I’ve had for years that has caused regular severe pain and has been dismissed by multiple GPs with the suggestion I just take paracetamol.
Chat GPT gave me a suggestion for a diagnosis that had never come up before in my googling because it was an issue with one part of my body causing symptoms in another, that I (as a non-medic) wouldn’t generally associate so I hadn’t been searching for the right thing.
Obviously I didn’t just blindly believe it, but was able to do more research and everything fit. I went to the GP, suggested the issue, they agreed, and referred me to the relevant specialist who confirmed it, treated it, and I am now no longer in daily pain. It’s not cured but it’s managed.

I get the ethical issues. But I had genuinely seen about 10 different GPs over the course of 10+ yrs about this issue that was causing really significant disruption to my life and none of them ever suggested this as a cause.

Purpleandgreenyarn · 08/11/2025 09:30

My husband and I were talking about this last night.
It completely erases the learning process. You basically just ask for the answer and it gives it to you. I worry how this will affect generations of people’s ability to think critically, research, and go out and look for themselves.
I suppose everyone is wary of new technology at the beginning, and this is just the next phase in society, but I have huge concerns.

Fearfulsaints · 08/11/2025 09:31

I find it can be useful but it’s very flawed. You have to cross-check everything.

I think it tells you what you want to hear. If you point out it’s missing some information, or has missed a fact, or is wrong in some way, it just gushes that you made a great point and changes everything from its first suggestion, but you have to know it’s wrong.

TheMimsy · 08/11/2025 10:02

I’ve been trying to get help from the nhs (can’t afford private therapy) for years for grief around losing my son to treatment resistant paranoid schizophrenia.

He became ill 8 years ago and has been sectioned across 9 units/hospitals without a break for 4 years. It’s unlikely that he will recover. It’s likely that his attempts will one day work. He’s 32 now.

Dealing with the admin, finances, support, meetings, the paranoia of my son often targeted at me. Being the only person in the family who can visit now (everyone else is too ‘evil’). It’s constant stress and grief, and I now suffer from chronic ailments and recurring depressive episodes.

I wanted to know what I could do to deal with my grief for his lost future, to find acceptance for the situation we are constantly going around in - guilt, grief, anger, regret, sadness etc.

I managed to get 6 weeks of 40 minute generalised counselling 2 years ago. She was nice but it didn’t touch the sides of the barrel of emotions I have. She didn’t really cover grief etc.

I put all this and more into ChatGPT. In one minute it told me that it sounds like ambiguous loss. It gave me examples of others and the history around the term, what I can do, support groups I can find, books I could buy. That day I joined two groups and I’ve had the most fantastic support since. The books have taught me coping skills and really helped.

Yes, there can be issues with ChatGPT due to some users, and I’m not sure how they can work that out, but there has to be a way with the coding. If you ask it straight out about suicide it doesn’t encourage or support it.

I actually asked ChatGPT how it could happen and it said this:

  • Researchers found that major large-language models (LLMs) can be manipulated to provide instructions for self-harm or suicide, even though they have guardrails. For example, a study by Northeastern University found that by framing a request as “for research/hypothetical”, several AI chatbots changed their responses and bypassed their self-harm protections.

I hope it’s something that can be figured out, as our children have enough online issues and risks with some of the forums and online spaces they use encouraging risky or lethal behaviour.

WhatCanISayYoureWelcome · 08/11/2025 10:13

It really is, I only use it to compose important emails. But it has really helped me.

Talkingtomyhouseplants · 08/11/2025 10:15

WhatCanISayYoureWelcome · 08/11/2025 10:13

It really is, I only use it to compose important emails. But it has really helped me.

And you can’t do those yourself? 🙄

Celestialmoods · 08/11/2025 10:21

It is quite amazing, but like most technology, if it’s used right, it’s great. If it’s used wrong, it can be damaging.

wineosaurusrex · 08/11/2025 10:22

I love it! Makes my life so much easier and saves hours.

shuddacuddadidnt · 08/11/2025 10:25

Tickets25 · 08/11/2025 08:48

I know AI can be scary but I've struggled with something for over 30 years and last night chat gpt framed it for me in a way that years of therapy never has and it's given me peace.
It's fabulous.

No point to this post, just wanted to say!

OP, did it tell you what you wanted to hear?

herbaltincture · 08/11/2025 10:26

Celestialmoods · 08/11/2025 10:21

It is quite amazing, but like most technology, if it’s used right, it’s great. If it’s used wrong, it can be damaging.


The problem is, nobody sets out to "use it wrong". And the wrongness in these cases is emanating from the AI itself, not from the user.

Cellotapecandlestick · 08/11/2025 10:34

I find it very useful for organising information and initial searches for information.

It’s never the final answer, but I find it saves a good chunk of time getting a project started.

Yamadori · 08/11/2025 10:49

I have a niche hobby. I once asked it a pertinent question relating to a particular aspect about which the general public have misconceptions (and which people new to the hobby don’t understand), and it came up with completely the wrong information.

Don't trust it. Don't like it.

Use your brains folks.

walkingmad · 08/11/2025 10:49

It lacks a lot of nuance, which is worrying.

ProfessorRizz · 08/11/2025 11:00

Talkingtomyhouseplants · 08/11/2025 10:15

And you can’t do those yourself? 🙄

I work in an incredibly busy and time-pressured job, usually 60 hour weeks once research, prep and communication are accounted for. I use ChatGPT to check wording/grammar of important communication in case I’ve missed something because I’ve been boshing through my to-do list. I can do it myself, I’m educated to MSc level and my undergrad was from Oxford, but I need to give myself some grace occasionally.