

Update from MNHQ addressing the recent images posted on the site

487 replies

JustineMumsnet · 04/02/2025 12:13

Hi all,
There have been a number of threads discussing what happened re the posting of CSA images on site, so I want to be absolutely clear: we would never seek to shut down reasonable criticism. We're taking on board all the feedback and will be carefully considering how we can improve our procedures and moderation to prevent this happening again.

As a temporary measure, we have suspended all image posting and will soon be implementing AI filters to flag illegal/disturbing images before they appear on the site. We're also liaising with external specialists to see if there are any further tools we can employ to help prevent this from happening again. We reported it to the police first thing the morning after the attack, and we have a follow-up meeting with them tomorrow.

It's also pretty clear from what we can see behind the scenes that there is an ongoing, coordinated effort from trolls to further inflame these discussions and cause as much disruption as possible. We are taking steps to remove bad-faith actors, but we know this can be frustrating for those who just want to express their views about what happened. If you come across any posts that seem designed to stir up more conflict rather than contribute constructively, please do report them - your reports really help us to act swiftly. Many thanks for your patience while we work to sort this out.

myplace · 04/02/2025 13:51

It’s important to recognise there ARE bad faith actors, and avoid helping them with their work.

Inevitably there will be a few too many deleted and a few too many left up. No system is going to be perfect, especially on day 2 of the incident.

BarbaraHoward · 04/02/2025 13:52

Frequency · 04/02/2025 13:44

Using volunteer mods is common on chat sites, especially quieter ones, which MN is out of hours since most users are in the UK/Europe.

It can be helpful for the volunteers, especially if they are trying to rejoin the workforce or get into media/web dev as a career, in terms of work experience and references. However, there should be a system in place with someone paid on call whom the volunteers can reach out to in an emergency like this one, and they should have the ability to suspend posters and view reports.

And that's fine for small groups run for fun.

MN is a massive business with turnover and profit in the millions. Users' content has made its founders very wealthy indeed.

It's not unreasonable to expect such a big business to be run professionally.

MaMisled · 04/02/2025 13:53

Thank you for the update and thank you night volunteers, it must have been completely awful.

I do hope you will offer an official apology that this was possible on a huge, respected international forum.

I know I'm not alone in having the one image I saw for a split second seared in my mind. It keeps appearing and, here I am, feeling shame and guilt because I can't stop seeing it. I feel terrible because it's there. I feel poisoned.

My thoughts are with the victims. My feelings are but one grain of sand in their desert of pain.

oakleaffy · 04/02/2025 13:54

MariaThomasFangs · 04/02/2025 13:46

These are usually smaller sites that don't make millions in profit though. I think if you're making as much money as MN is, you can afford to pay for overnight support, especially since no one volunteering should have to deal with things like CSA images.

Even paid mods oughtn’t to have to see illegal CSA images, though.

Even £1,000 a night would not be enough for many to see illegal images.

AI vetting is needed for images.

Safe Search on Google has kept me from seeing inappropriate images - so obviously such software exists.

However, Safe Search on Google doesn’t work if people post illegal images to a forum.

MariaThomasFangs · 04/02/2025 13:55

oakleaffy · 04/02/2025 13:54

Even paid mods oughtn’t to have to see illegal CSA images, though.

Even £1,000 a night would not be enough for many to see illegal images.

AI vetting is needed for images.

Safe Search on Google has kept me from seeing inappropriate images - so obviously such software exists.

However, Safe Search on Google doesn’t work if people post illegal images to a forum.

Edited

Yes, I do agree with that. No one should have to see it, but to expect unpaid volunteers to be dealing with such things is beyond the pale. I'm glad MN are going to look into some tech solutions to stop the images before they can be posted at all.

SDTGisAnEvilWolefGenius · 04/02/2025 13:56

Hopefully this will be one of the issues that MNHQ are talking to the experts about, @oakleaffy.

wrongthinker · 04/02/2025 13:57

RadFs · 04/02/2025 13:50

What’s this all about? Seems like I missed something

MN have been closing down threads discussing the situation. I'm not sure why - maybe to keep it out of the press? But users aren't aware that reports made overnight (UK) are not seen until the morning, so if something like this happens again, they won't know how to report it.

I think people would be reassured by more transparency. I understand that there are trolls and bad actors but by not communicating with users, I fear MN are playing into their hands.

RainbowZebraWarrior · 04/02/2025 13:58

WhoPutTheBomp · 04/02/2025 13:44

Just as a point of interest here's a link to the kickoff of NightWatch:

www.mumsnet.com/talk/the_night_watch/2015451-Night-Watch-begins-this-evening

Thanks for this.

Something that has often crossed my mind (knowing that the volunteer night watch were largely recruited from 'folk often on MN in the wee hours'): there's a big difference between often being up and online in the wee hours and being obligated to moderate a forum. Are these people in different time zones (I doubt it), or users with health issues, sleep disorders, or insomnia, for example? In which case, is that, or was that ever, really an ethical thing to do?

Just to add, this is a genuine question. I'm a regular poster of many years, and I've always felt uncomfortable about the use of NW. I've also been a moderator volunteer on a much smaller site before, which really opened my eyes to the risks.

Joker01 · 04/02/2025 13:58

Cremeeggtime · 04/02/2025 12:35

your reports really help us to act swiftly
Somewhat ironic given a big issue is that reports made during the night don't get seen.

I agree. Sorry @JustineMumsnet this isn’t really addressing the issue. It’s not good enough.

wrongthinker · 04/02/2025 13:59

MaMisled · 04/02/2025 13:53

Thank you for the update and thank you night volunteers, it must have been completely awful.

I do hope you will offer an official apology that this was possible on a huge, respected international forum.

I know I'm not alone in having the one image I saw for a split second seared in my mind. It keeps appearing and, here I am, feeling shame and guilt because I can't stop seeing it. I feel terrible because it's there. I feel poisoned.

My thoughts are with the victims. My feelings are but one grain of sand in their desert of pain.

Someone mentioned on the other thread that playing Tetris can help. Sounds nuts, but there's some solid research backing it up. Can't hurt, anyway. Hope you're okay.

Joker01 · 04/02/2025 14:00

I still think that alongside the AI monitoring there should be a delay in pictures being posted, so that they can be checked before going live. If the picture is flagged, then even the mods don't have to allow it; they can disallow it straight away. That way, no bad photos have the chance to go live.

WhatWasPromised · 04/02/2025 14:06

Firstly, thank you for the update; however, I do find it slightly surprising that it's taken this long to release ANY kind of statement on it.

I have no idea of the traffic overnight on here, or whether using volunteers to moderate is appropriate (based on an industry standard), but it's clear it's not enough. It's staggering to me (and this isn't a criticism of the volunteers, but more of MNHQ) that they appear to have no idea how to contact someone who can actually resolve issues?!

I am also another user who had no idea the report function doesn’t work overnight.

AutumnFroglets · 04/02/2025 14:06

It's also pretty clear from what we can see behind the scenes that there is an ongoing, coordinated effort from trolls to further inflame these discussions, and cause as much disruption as possible.

Even reading as a normal person you could tell there were certain agitators who were going over the top and trying to get a lynch mob going. They are just as sick in the head imo trying to take advantage of such a horrific situation.

Flowers to all the night staff

oakleaffy · 04/02/2025 14:07

Joker01 · 04/02/2025 14:00

I still think that alongside the AI monitoring there should be a delay in pictures being posted, so that they can be checked before going live. If the picture is flagged, then even the mods don't have to allow it; they can disallow it straight away. That way, no bad photos have the chance to go live.

Do we really even NEED images?

Often it’s a link people post to (a house for sale or whatever).

People have posted pics of their pets or a parking diagram, but I’d rather see zero images than give criminals the chance to post illegal images.

EasternStandard · 04/02/2025 14:09

Thanks for the update

I'm glad you've used the term CSA in this post, as I found the last one's wording of 'vile images' upsetting in terms of minimising.

My query is the same as pp's: will night watch moderators be paid?

Joker01 · 04/02/2025 14:10

oakleaffy · 04/02/2025 14:07

Do we really even NEED images?

Often it’s a link people post to (a house for sale or whatever).

People have posted pics of their pets or a parking diagram, but I’d rather see zero images than give criminals the chance to post illegal images.

That’s a fair point. I think people use them for diagrams or photos of themselves, but even links could be clicked upon to lead to unpleasant media. For me, AI isn’t good enough on its own. There needs to be better monitoring of the site to make sure that these things are dealt with as swiftly as possible should the site be infiltrated - no matter what medium they are coming from. It needs to be a multifaceted defence, rather than just ‘We’re doing AI’. This has shown a serious breach in Mumsnet’s defences and they need to be doing much more.

JaneJeffer · 04/02/2025 14:12

Do we really even NEED images?
Yes we do. I wanted to post one on a thread just now but obviously can't at the moment.

MyNameIsX · 04/02/2025 14:15

Ironically, it's during times like these that my faith in human nature is restored, given the number of good people on MN who speak out against such abhorrent behaviour.

We stand together.

BarbaraHoward · 04/02/2025 14:16

Joker01 · 04/02/2025 14:10

That’s a fair point. I think people use them for diagrams or photos of themselves, but even links could be clicked upon to lead to unpleasant media. For me, AI isn’t good enough on its own. There needs to be better monitoring of the site to make sure that these things are dealt with as swiftly as possible should the site be infiltrated - no matter what medium they are coming from. It needs to be a multifaceted defence, rather than just ‘We’re doing AI’. This has shown a serious breach in Mumsnet’s defences and they need to be doing much more.

Edited

Allowing users to post photos is very basic functionality. Pausing that for a few days now while they get on top of things is one thing, but moving backwards in terms of the functionality would be another. Users will reasonably expect to be able to post a parking diagram/photo of their decor/picture of their desired haircut.

Automated screening of images to prevent unsuitable things being posted would be great. However that will never catch everything - someone will always find a way through. The real disgrace is that when it happened, it was unpaid volunteers who had to deal with it with completely inadequate tools and training.

JustineMumsnet · 04/02/2025 14:20

We understand the concern around moderation overnight, and we want to clarify a few things.

First, it’s always worth reporting posts, no matter the time of day. We have a number of automated moderation controls in place that operate overnight, and we are actively working to strengthen these further. While we won’t go into detail because we don't want to give troublemakers intel on how to avoid detection, we want to reassure you that reports are an effective way of flagging issues at any time of day or night.

NightWatchers are long-standing, experienced Mumsnet users who live in different timezones and who very kindly have offered to help keep an eye on things overnight. They do have some admin tools to manage situations (they can hide posts and threads and suspend users) and we are looking at ways in which we can augment these tools, including improving alerts when a thread is being widely reported.

It’s worth noting that many large platforms rely almost entirely on volunteer moderators.

The issue on Sunday night wasn’t night watch controls or moderation - it was that these images could be posted at all. Even if this had happened during the day, they still would have appeared on the site. That’s why our focus is on preventing this kind of content from being uploaded in the first place.

We know this has been a distressing situation, and I promise we’re working hard to ensure it doesn’t happen again. Thanks to everyone who has taken the time to share feedback - we do take it on board, and we appreciate it.

RainbowZebraWarrior · 04/02/2025 14:21

JaneJeffer · 04/02/2025 14:12

Do we really even NEED images?
Yes we do. I wanted to post one on a thread just now but obviously can't at the moment.

I see some people have been using postimg.cc in order to link to photos (I think it works like Flickr, whereby you upload a photo then provide a link to the photo here). I believe the site doesn't allow illegal images to be uploaded, so it is safe.

I Googled it, and its whole MO is about 'safe images', which sounds like it could be the answer.

BarbaraHoward · 04/02/2025 14:23

They do have some admin tools to manage situations (they can hide posts and threads and suspend users) and we are looking at ways in which we can augment these tools, including improving alerts when a thread is being widely reported.

So no change then.

RainbowZebraWarrior · 04/02/2025 14:23

Thanks very much for the latest update @JustineMumsnet it's really helpful.

oakleaffy · 04/02/2025 14:28

Joker01 · 04/02/2025 14:10

That’s a fair point. I think people use them for diagrams or photos of themselves, but even links could be clicked upon to lead to unpleasant media. For me, AI isn’t good enough on its own. There needs to be better monitoring of the site to make sure that these things are dealt with as swiftly as possible should the site be infiltrated - no matter what medium they are coming from. It needs to be a multifaceted defence, rather than just ‘We’re doing AI’. This has shown a serious breach in Mumsnet’s defences and they need to be doing much more.

Edited

That’s true re links - but hopefully they wouldn’t be dark-web-type stuff.

I’m not tech savvy at all, but I have noticed Mumsnet seem to get issues with tech (massive ads that obliterate text, or else some tech-savvy kid has a field day).

They really do need to invest in some serious protection that is regularly updated.

I remember a hack of emails and passwords on here years ago that led to major spamming of users.

Hopefully the site will be made much more secure and “safe”.

JaneJeffer · 04/02/2025 14:29

RainbowZebraWarrior · 04/02/2025 14:21

I see some people have been using postimg.cc in order to link to photos (I think it works like Flickr, whereby you upload a photo then provide a link to the photo here). I believe the site doesn't allow illegal images to be uploaded, so it is safe.

I Googled it, and its whole MO is about 'safe images', which sounds like it could be the answer.

I believe the site doesn't allow illegal images to be uploaded, so it is safe.
So why can’t MN use the same type of technology? We shouldn’t have to go through a third party just to post a photo.