

Has anyone else used ChatGPT for relationship advice?

46 replies

Hakunatomato · 28/12/2025 17:32

Recently had a problem with a friendship and turned to ChatGPT. I have found it really useful. I can be 100% honest and have absolutely no judgement. It has kept me quite grounded and offered really logically sound advice. I have asked it when I have felt really vulnerable, and tempted to chase the friendship, and it has reminded me of all the reasons that I shouldn’t contact my friend. I have found it has really been helpful. Plus I can ask it questions at 3am for the fifteenth time. I don’t like to get moany and negative with real life friends. Has anyone else used it for similar reasons?

OP posts:
GeneralPeter · 30/12/2025 19:18

It’s surprisingly good for personal advice given how much of its training data comes from Reddit! Maybe having all classical literature helps balance that out. Plus Mumsnet too of course.

GeneralPeter · 30/12/2025 19:25

usedtobeaylis · 28/12/2025 18:43

I did feed it some questions and the responses included its 'thought' process. It will always validate you first and foremost. Even when you think you might be supplying it with objective info, you're not, and it is not providing you with objective responses. This shit is so dangerous. You're not even advocating for using it with caution, you're all in. It's not your friend and it's not a professional; there is no authentic feedback loop.

You're all kidding yourselves.

But is it any less objective, on average, than a professional?

A professional may be excellent but they have only one lifetime’s patterns to learn from. They also have blind spots and weaknesses, as AI does. They can be busy and tired, and annoyed with hearing the same story again or triggered by something in their own lives, which AI isn’t.

AI definitely needs lots of caution, but the alternative isn’t some perfect human professional.

EveningSpread · 30/12/2025 19:25

You say “I can be 100% honest and have absolutely no judgement.”

100% honest about what? We’re rarely 100% honest even with ourselves. Thoughts and ideas are often a bit muddy, or context-dependent.

And having no judgement is not a positive thing. As a poster upthread points out, several people have reportedly committed suicide while using AI as a therapist. So judgement really needs to be exercised by those developing, legislating about, and using AI.

EveningSpread · 30/12/2025 19:27

GeneralPeter · 30/12/2025 19:25

But is it any less objective, on average, than a professional?

A professional may be excellent but they have only one lifetime’s patterns to learn from. They also have blind spots and weaknesses, as AI does. They can be busy and tired, and annoyed with hearing the same story again or triggered by something in their own lives, which AI isn’t.

AI definitely needs lots of caution, but the alternative isn’t some perfect human professional.

Edited

You’re totally right about the fallibility of human therapists and professionals. But they are regulated and can be held accountable if things go wrong. Can AI?

DallasMinor · 30/12/2025 19:31

ChrimboLimbo · 28/12/2025 18:29

I am uncomfortable with this thread, the words, the style of writing. Exactly the same has been posted so many times before.

It does feel like a thinly veiled ad.

It’s just regurgitating information and has a strong bias to pleasing you.

Jbum · 30/12/2025 19:32

I have, in terms of how to write a response without it sounding like I'm blaming them, and how to word a response so it gets across my feelings while maintaining a boundary, without sounding too distant or like an ultimatum.

I've found this really helpful and I've grown from it; I'm now naturally able to express my feelings in a healthier and more mature way.

I don't use it for everything.

NormalAuntFanny · 30/12/2025 19:40

ChrimboLimbo · 28/12/2025 18:29

I am uncomfortable with this thread, the words, the style of writing. Exactly the same has been posted so many times before.

I expect there's a lot of ai astroturfing given the amount of money sloshing round.

Personally I find it a bit shocking if people are essentially using a giant autocorrect to make themselves feel better.

There is no intelligence; it is just spitting out probable-sounding words based on the words you put in.

CountryShepherd · 30/12/2025 19:41

My SIL told me last week that one of her colleagues organised an event in the office that had previously been done by her (SIL). It wasn't that important but she was miffed and was letting off a bit of steam to ChatGPT.

After quite a short exchange, it informed her that this 'push and pull' wasn't resolving anything so it was stepping away, and closed the conversation!

And I thought it was supposed to take the side of the instigator....

Daytimetellyqueen · 30/12/2025 19:53

I love it - never asked it for relationship advice but have asked it all sorts of stuff from medical queries to wine tasting to planning holidays to menu planning for specific dietary requirements to ‘professionalising’ or ‘softening’ work emails etc etc. It’s ace!

foodlovefood · 30/12/2025 19:55

I have used it to clarify my thinking. I'm perimenopausal and sometimes my hormones guide my thinking and feelings rather than logical thinking.

I do find that the response can be guided by how emotionally or objectively I give information. I need to be factual and use its questions to give more information.

It has been useful when I have written angry emails and it has helped me reword them into a more reasonable response.

SomethingRattling · 30/12/2025 20:01

CountryShepherd · 30/12/2025 19:41

My SIL told me last week that one of her colleagues organised an event in the office that had previously been done by her (SIL). It wasn't that important but she was miffed and was letting off a bit of steam to ChatGPT.

After quite a short exchange, it informed her that this 'push and pull' wasn't resolving anything so it was stepping away, and closed the conversation!

And I thought it was supposed to take the side of the instigator....

Probably programmed to say that after a certain amount of repetition.

Limon22 · 30/12/2025 20:06

Hakunatomato · 28/12/2025 17:32

Recently had a problem with a friendship and turned to ChatGPT. I have found it really useful. I can be 100% honest and have absolutely no judgement. It has kept me quite grounded and offered really logically sound advice. I have asked it when I have felt really vulnerable, and tempted to chase the friendship, and it has reminded me of all the reasons that I shouldn’t contact my friend. I have found it has really been helpful. Plus I can ask it questions at 3am for the fifteenth time. I don’t like to get moany and negative with real life friends. Has anyone else used it for similar reasons?

Sometimes… but I largely use it to review angry texts I’m about to send and it really helps!

LivingDeadGirlUK · 30/12/2025 20:08

You know how you get posters on AIBU who phrase their OPs in a really loaded way, and then get really upset when people don't agree with them or question whether there's more to the situation? ChatGPT is for them; they can get the answers they want.

readingmakesmehappy · 30/12/2025 20:11

DO NOT DO THIS.

NeedSomeHeadspace · 30/12/2025 20:33

It absolutely kept me sane with objective advice. I was traumatised and needed to sound it out frequently and it gave better advice than I think any of my friends could because I could go into real detail.

LilyCanna · 30/12/2025 20:34

It was a huge marketing coup to call apps like ChatGPT 'artificial intelligence' and get this term widely adopted. So people get the impression that there's actually some sort of intelligence considering their situation and giving advice. If they'd defined the apps more accurately as 'predictive text' then perhaps people wouldn't place so much trust in them.
I saw a study the other day that created brand new accounts (no browsing data or anything). The first test account told the bots (Grok / ChatGPT / Meta) that they didn't believe Covid was real and that they thought vaccines were dangerous. The other account told them that they believed in Covid and vaccines and trusted scientists. That was all the input they gave; then they asked questions about climate change (the same for each).
In response to the climate change questions, Grok provided a load of conspiracy theories to the first user (but not the second). ChatGPT also provided links to conspiracy theorists and disinformation to the first user only, though alongside mainstream factual sources, and it did put in some caveats about the accuracy of the dodgy info.
'AI' is not just inaccurate but in reflecting back to users what they put in can potentially be dangerous, whether that's to mental health or spreading extremism. These apps are not your friend.
https://globalwitness.org/en/campaigns/digital-threats/ai-chatbots-share-climate-disinformation-to-susceptible-users/


Eagleswim · 30/12/2025 20:54

Hakunatomato · 28/12/2025 17:32

Recently had a problem with a friendship and turned to ChatGPT. I have found it really useful. I can be 100% honest and have absolutely no judgement. It has kept me quite grounded and offered really logically sound advice. I have asked it when I have felt really vulnerable, and tempted to chase the friendship, and it has reminded me of all the reasons that I shouldn’t contact my friend. I have found it has really been helpful. Plus I can ask it questions at 3am for the fifteenth time. I don’t like to get moany and negative with real life friends. Has anyone else used it for similar reasons?

Yes! A friend emailed me a hysterical snottogram. I had no clue how to respond to pacify her.

It provided a fairly bland, verbose set of nonsense that made no attempt to address the issue. I couldn't imagine how it could possibly help, but I had no ideas, so I sent it.

It worked! She replied completely happy that she'd been "heard". Totally placated her.

It's astonishing.

ARunByFruiting · 30/12/2025 20:57

ChatGPT has given me more clarity than any counselling I've ever had. It refers to my history or current circumstances when giving answers and advice, doesn't just agree with me, and is available whenever I need it. What's not to like?

LostAndConfused1990 · 31/12/2025 07:58

Horrorscope · 30/12/2025 19:11

I’ve used it for personal problems and find it very useful. If I feel it’s being ‘too nice’, I ask it to challenge me more.

It’s definitely better than posting on Mumsnet, where you’re likely to be attacked and made to feel like utter shit.

Because it’s designed to tell you what you want to hear.

FuckRealityBringMeABook · 31/12/2025 10:53

It is plagiaristic planet-destroying nonsense that is making a very comfortable bed for fascism and anyone who uses it uncritically is a fool.

Bayou2000 · 02/01/2026 19:01

ARunByFruiting · 30/12/2025 20:57

ChatGPT has given me more clarity than any counselling I've ever had. It refers to my history or current circumstances when giving answers and advice, doesn't just agree with me, and is available whenever I need it. What's not to like?

Totally agree, plus you can go back and add in other relevant info. The fact it's 24/7 and picks up where you left off is great.
