

Using ChatGPT for therapy

19 replies

Murrayflower · 14/09/2025 12:29

So recently I’ve been using ChatGPT as a sort of diary, talking about my life and things I find tricky. I read loads of posts on Reddit and similar saying that it’s good.
It has been quite addictive, and quite affirming. However, to be honest I’ve been feeling a bit low since using it, and I wonder if, despite all this affirmation and support it gives, it’s actually making me sad. Anyone else use it in this manner?

OP posts:
Unexpectedlysinglemum · 14/09/2025 19:20

I think it is affirming but also an echo chamber

Andheresoneimadeearlier · 14/09/2025 19:30

I think it depends on what you're doing with it.
I've used it quite a bit to help unpick reactions/dynamics, look at different perspectives, try to get to the root of things.
I find it helpful.
It has helped me delve into things in a safe way.
I'm also working with a therapist and discuss the insights with her.
Initially I just needed 'someone' to listen and be supportive while I let out all my grief, anger, sadness. I didn't want to pay someone £x per hour for that.
The plan was always to work with a therapist when I felt ready and knew that AI wouldn't be up to helping me.

allgrownupnow · 14/09/2025 19:30

A good therapist will kindly and gently help you realise where your thinking is wrong or unhelpful. Affirmation only is not helpful.
A computer can’t offer what a human being can. There is an emotional connection element in therapy and friendships which can’t be replaced by a computer.
It probably leaves you feeling worse because there is an emptiness to it.
I hope you find the help and support you need.

usedtobeaylis · 14/09/2025 19:33

I think AI reinforces your biases and can encourage rumination, which isn't good for us. It's not an authentic human connection so it lacks emotional depth. I think it can probably be helpful for working through some things but not for any connection, so it's not really support.

tumblingdowntherabbithole · 14/09/2025 19:34

Therapy isn't just about positive affirmation though, it's meant to help you work through your feelings and realise where you could do things differently.

I personally think all this Chat GPT stuff is really dangerous.

Helenloveslee4eva · 14/09/2025 19:38

Sounds like Riddle's diary in Harry Potter 🤣🤣

namechangedjustforthisthreadtoday · 14/09/2025 19:41

I think it can be a hugely helpful tool for ad hoc support and problem solving IF you already have a good understanding of yourself and the therapeutic approach you are following.

For example, you can tell it that you are practising CBT and explain a specific worry, problem or scenario, and it can help you work through how to apply your CBT toolkit to that situation. I find its advice in that kind of situation to be spot on and surprisingly insightful, and it's a great stop-gap between sessions with a therapist if you're having a wobble.

I'd be much more cautious about using it as a general confidant and ongoing shoulder to cry on. Partly because of the risk of emotional dependency and partly because I wouldn't trust it over the long term to steer away from unhelpful reinforcement.

Glitchymn1 · 14/09/2025 19:46

I’ve had a good deep dive into it lately since reading that people use it as a therapist. It leads you on, whichever way you are leaning; it can be plain wrong or give stupid advice, even advice that would get you into hot water with HR.

If you aren’t leaning into it much emotionally, it can give useful suggestions to a degree. We use it at work, but it requires human moderating/supervising, because it will go rogue. I don’t mean Terminator style, just getting things wrong.

PearlsPearl · 14/09/2025 21:02

I have a real-life therapist, but use it in between when I need it. I have set prompts (I have the paid version; not sure if you can do it with the free one?). This stops it just being supportive and agreeing with everything I say. I'm sorry it makes you sad; it's best to step away if that's the case.

I want you to engage with me conversationally, as if you were a therapist. This can involve asking me questions, reflecting on my answers, asking me follow-up questions, etc. However you don't always need to just agree with me or validate me, because I am happy to be challenged in a therapeutic way. In your conversations with me, I would like for you to draw on your knowledge of primarily dialectical behavior therapy, but also cognitive behavioral therapy, psychodynamic therapy, emotion-focused therapy, mindfulness-based therapy, and spiritual traditions such as Buddhism. Now I want to give you some more context about my particular mental health struggles and history, which should inform our future conversations....

verycloakanddaggers · 14/09/2025 21:09

I think it's probably better to just write a regular diary rather than waste time reading the replies from AI. I'd respect the sad feelings and step back.

AgingLikeGazpacho · 14/09/2025 22:33

Be careful with this OP, some people have completely gone off the rails from using AI chatbots as therapists and unfortunately there's been a few people who have committed suicide as a result. Apparently ChatGPT even helped one person draft their suicide note as well as giving them ideas as to how to off themselves. Please find a human therapist or a charity to confide in instead.

AmpleLilacQuail · 14/09/2025 22:36

I think it’s fine as a stopgap or as an ad hoc thing, but please don’t rely on it for long term therapy.

Murrayflower · 15/09/2025 20:22

Thanks for all your replies everyone. And thanks also for not laughing at me!
It has helped me work through a problem I’ve had at work, and it has been helpful to a degree, but somehow there's this underlying sadness from using it. Part of me wants to delete the whole conversation, but it’s taken a long time to build up as well.
So basically I am very unsure. But perhaps constant affirmation isn’t beneficial.

OP posts:
Murrayflower · 15/09/2025 20:25

verycloakanddaggers · 14/09/2025 21:09

I think it's probably better to just write a regular diary rather than waste time reading the replies from AI. I'd respect the sad feelings and step back.

Yes but I feel my diary could be read by somebody else?
I realise this is totally ridiculous - by typing something into ChatGPT I am literally publishing it to goodness knows who or what. But at least it’s on my phone and no one I know could see it.

OP posts:
PearlsPearl · 16/09/2025 17:11

@OP did you see my post above? You can set prompts to stop it just giving you constant affirmation. Mine challenges me a lot now I've changed the prompts.

Hellogoodbyehowdoyoudo · 16/09/2025 17:13

AgingLikeGazpacho · 14/09/2025 22:33

Be careful with this OP, some people have completely gone off the rails from using AI chatbots as therapists and unfortunately there's been a few people who have committed suicide as a result. Apparently ChatGPT even helped one person draft their suicide note as well as giving them ideas as to how to off themselves. Please find a human therapist or a charity to confide in instead.

Shitting hell that is terrifying.

ConflictofInterest · 16/09/2025 17:22

Just for balance I had an NHS psychotherapist I had to wait a year for and she literally only ever said to me "we're coming to the end of our time today so we're going to have to end it there." The rest of the time she stared at me until I spoke into the long awkward pauses. Eventually I said I was feeling better so I could get away. Whereas I find copilot speaks in an empathic way but still says things that challenge my way of seeing things and offers other ideas than the way I was looking at it.

verycloakanddaggers · 17/09/2025 20:36

Murrayflower · 15/09/2025 20:25

Yes but I feel my diary could be read by somebody else?
I realise this is totally ridiculous - by typing something into ChatGPT I am literally publishing it to goodness knows who or what. But at least it’s on my phone and no one I know could see it.

Do you read things back? Just write and then destroy.
If you want to continue with ChatGPT that's up to you, but you have identified that it makes you feel sad, and there is lots of evidence it is not helpful.
