
Feminism: Sex and gender discussions

MN targeted with abuse images

20 replies

ArabellaScott · 04/02/2025 16:13

https://www.bbc.co.uk/news/articles/c93qw3lw4kvo

Front page BBC.

'Parenting site Mumsnet says it has stopped users from sharing pictures after it was targeted with images of child sexual abuse.
Company founder Justine Roberts told the BBC the "horrific incident" had been reported to police after the images were posted on the platform over several hours late on Sunday.
It has now suspended the facility to post pictures on the site as a temporary measure and is planning to introduce artificial intelligence (AI) filters to flag "illegal" and "disturbing" images before they appear.'


OP posts:
ArabellaScott · 04/02/2025 16:14

'Ms Roberts said the company was "regularly subject to threats and attacks from people who seem to want to derail the conversation on site".
"This latest horrific incident feels like another attempt to do the same," she added.
"Over the years we've been swatted [fake calls to the police], attacked by bots and suffered bomb threats amongst other things.
"Right now our team is very focused on preventing any further such horrific images ever appearing on site and helping the police in their attempts to find the person or persons who posted them."
Ms Roberts said moderators who viewed the images of child abuse have been offered support.
Mumsnet was founded by Ms Roberts in 2000 as an internet forum for parents to swap advice. It says it now has around nine million unique users per month.'

OP posts:
Shortshriftandlethal · 04/02/2025 16:18

What are the chances of finding the perpetrator? Hopefully he can be tracked down and publicly exposed.

Obscurial · 04/02/2025 16:19

I wonder if they’ll consider paid moderators 24 hours a day now?

I feel lucky that I didn’t see anything, but if MN staff need support what about those online at the time? A paid moderator could have swiftly locked down the site immediately, unlike a volunteer night shift, with limited tools.

Bannedontherun · 04/02/2025 16:20

Thanks Arabella for your sharp and focussed eyes and mind.

wrongthinker · 04/02/2025 16:51

Couple of threads on this in Site Stuff. MN have been closing threads all day because the incident revealed they had no system for dealing with it, and overnight volunteer moderators were asking users to share images. There were images on the site for over 5 hours. The report button isn't responded to overnight, so volunteers were trying to hide threads. They're untrained and I should think extremely traumatised.

ArabellaScott · 04/02/2025 17:01

NoIncomeTaxNoVAT · 04/02/2025 16:55

Thanks. I think all users should be made aware - not many see stuff in Site Stuff.

OP posts:
Maaate · 04/02/2025 17:09

ArabellaScott · 04/02/2025 17:01

Thanks. I think all users should be made aware - not many see stuff in Site Stuff.

This is a point that lots of people have been making on the new threads - so many are not aware of what happened

SpiderPigSpiderPigDoesWhateverASpiderPigDoes · 04/02/2025 17:20

wonder if they’ll consider paid moderators 24 hours a day now?

I don't think so as she said that had it happened during office hours the pictures would still have been posted and that they are going to focus on stopping things getting posted in the first place using AI.

IwantToRetire · 04/02/2025 17:21

Someone started a thread about this yesterday on FWR but it was so obscure I didn't understand what they were talking about.

But if they are posted on a thread they must have a user name and email address, so they can be identified (to the police?).

IwantToRetire · 04/02/2025 17:23

But also there is a concern currently about MN's technical ability, as it seems there is a security issue in the advertising stream they use that is causing lots of problems, with users finding they are being hijacked to visit inappropriate sites and/or potentially download virus software.

SpiderPigSpiderPigDoesWhateverASpiderPigDoes · 04/02/2025 17:24

IwantToRetire · 04/02/2025 17:21

Someone started a thread about this yesterday on FWR but it was so obscure I didn't understand what they were talking about.

But if they are posted on a thread they must have user name and email address so can be identified (to the police?).

Well no because you can open a new email address in ninety seconds.

In fact, users were advised after the Great Hacking Scandal not to use their regular email address as every posters email address and password were leaked.

IwantToRetire · 04/02/2025 17:34

Well no because you can open a new email address in ninety seconds

Yes I know, but if the police are involved they can request further information about who created an address.

i.e. indicating whether it is a lone individual or a group of people.

But I suspect, in terms of police time, this will not be given much priority.

Maaate · 04/02/2025 18:41

SpiderPigSpiderPigDoesWhateverASpiderPigDoes · 04/02/2025 17:20

wonder if they’ll consider paid moderators 24 hours a day now?

I don't think so as she said that had it happened during office hours the pictures would still have been posted and that they are going to focus on stopping things getting posted in the first place using AI.

The difference is that mods have the power to delete posts/threads and ban users if this is tried during the day.

In this case it happened at night when no mods are on duty and the volunteer night watch members did not have the ability to delete or ban. They had to be linked to the threads/posts directly to hide them but that did nothing to stop the filth continuing to post on other threads.

Basically whack-a-mole, whilst exposing users to horrific CSA images.

SpiderPigSpiderPigDoesWhateverASpiderPigDoes · 04/02/2025 18:47

I know that. But that's what the press statement says.

Pickandmixusername · 04/02/2025 18:49

Obscurial · 04/02/2025 16:19

I wonder if they’ll consider paid moderators 24 hours a day now?

I feel lucky that I didn’t see anything, but if MN staff need support what about those online at the time? A paid moderator could have swiftly locked down the site immediately, unlike a volunteer night shift, with limited tools.

Agree. It's ridiculous mods don't get paid for their work. Wtf?

Truthlikeness · 04/02/2025 19:53

This attack was basically terrorism - traumatising women or creating the fear of seeing extremely disturbing images to drive women away from organising and talking about things that someone does not want them to. I hope the police take it extremely seriously - above and beyond dealing with possession of CSA images, which sadly these days seems to carry little consequence.

The extensive talk about what Mumsnet should have done is distracting from the actual crime. As Justine pointed out - many large discussion forums use volunteer moderators. Reddit makes over $800m a year and does not pay their mods.

IwantToRetire · 04/02/2025 20:04

This attack was basically terrorism - traumatising women or creating the fear of seeing extremely disturbing images to drive women away from organising and talking about things that someone does not want them to

This - but also no doubt revelling in sharing and promoting CSA. A message to women that you can't stop men doing what they want.

Pickandmixusername · 04/02/2025 20:07

Truthlikeness · 04/02/2025 19:53

This attack was basically terrorism - traumatising women or creating the fear of seeing extremely disturbing images to drive women away from organising and talking about things that someone does not want them to. I hope the police take it extremely seriously - above and beyond dealing with possession of CSA images, which sadly these days seems to carry little consequence.

The extensive talk about what Mumsnet should have done is distracting from the actual crime. As Justine pointed out - many large discussion forums use volunteer moderators. Reddit makes over $800m a year and does not pay their mods.

Hmm true, but I don't think Reddit is anything to aspire to. Their moderation is a hop, skip and a jump away from non-existent. Actually, if you even mention some of the Reddit boards on here, they are automatically filtered out by Mumsnet. So I'm not sure "well Reddit do it too" is a strong argument.

I'm honestly just a bit sick of online platforms making money out of the people who use them and then refusing to take any responsibility when something horrible happens 🤷‍♀️

BoreOfWhabylon · 04/02/2025 21:46

Maaate · 04/02/2025 18:41

The difference is that mods have the power to delete posts/threads and ban users if this is tried during the day.

In this case it happened at night when no mods are on duty and the volunteer night watch members did not have the ability to delete or ban. They had to be linked to the threads/posts directly to hide them but that did nothing to stop the filth continuing to post on other threads.

Basically whack-a-mole, whilst exposing users to horrific CSA images.

The Night Watch can and do delete posts and threads and temporarily ban users from posting until MNHQ are around. What they don't get is the reports, which is why they rely on being given a 'heads up' via the Night Watch board.
