
It's time to rethink the Night Watch volunteer system

1000 replies

TwigletsAndRadishes · 03/02/2025 05:06

After the truly awful photos that have been posted on MN in the early hours of this morning, which I will never un-see, I think you need to have a different policy for overnight moderation.

The site is big enough and money-making enough to support a better system by now. There really needs to be proper 24-hour IT cover. At the very least, the Night Watch volunteers should be able to call a member of MNHQ on call overnight who can either deal with the issue or shut the whole forum down until it can be dealt with properly by the IT bods in the morning.

It's only been a handful of threads affected this evening, but it could potentially be a major spamming of dozens or even hundreds of threads next time. This site is used in other parts of the world in different time zones where more people risk seeing the grossly disturbing content than a relative handful of us up at 3am in the UK.

The current system relies on posters reporting any offending threads and hoping that a volunteer will pick up the report in good time and hide the thread. This morning one of the threads affected took far too long to be hidden, not that I am blaming an unpaid volunteer for that.

RedLightsStopSigns · 03/02/2025 13:34

This is the push that I needed to take a step back from MN. I’m lucky that I wasn’t on here last night and I didn’t see the threads in question. But no way am I risking seeing something like that.

Growlybear83 · 03/02/2025 13:35

Thankfully I didn't see the posts concerned, although I saw a thread talking about them when I went to bed at 1. But apart from the obvious outrage about the images that were posted and the language used, am I right in thinking that everyone who opened the threads and saw the pictures concerned will now have images of child sexual abuse in their browsing history?

OwlInTheOak · 03/02/2025 13:36

Surely there could be some sort of AI technology to screen for inappropriate photos. It's definitely possible/available given how FB blurs certain content, and I can't see any reason not to implement it.

myplace · 03/02/2025 13:36

We don’t know that MN has no suitable measures in place. We do know that their measures didn’t work. They will need to work out why and what they can improve to reduce the likelihood of a recurrence and to ensure a better response when something does happen again.

And I would expect them to take time to fully explore the causes and implications of what happened before a full response.

wrongthinker · 03/02/2025 13:37

FrustratedandBemused · 03/02/2025 13:31

I reported at the same time and have had the same email from MN, complete with the wording.

WTF.

This is such a horrible, horrible situation. MN have handled it so badly. I hope the night watch volunteers are getting proper, professional support at least.

swallowedAfly · 03/02/2025 13:38

Growlybear83 · 03/02/2025 13:35

Thankfully I didn't see the posts concerned, although I saw a thread talking about them when I went to bed at 1. But apart from the obvious outrage about the images that were posted and the language used, am I right in thinking that everyone who opened the threads and saw the pictures concerned will now have images of child sexual abuse in their browsing history?

Yes. And some shared links to NW which is distribution.

ShortyShorts · 03/02/2025 13:38

Growlybear83 · 03/02/2025 13:35

Thankfully I didn't see the posts concerned, although I saw a thread talking about them when I went to bed at 1. But apart from the obvious outrage about the images that were posted and the language used, am I right in thinking that everyone who opened the threads and saw the pictures concerned will now have images of child sexual abuse in their browsing history?

I don't think so?

They'll have the threads in their browsing history but if the images have been removed, I don't think they'll be there?

Either way, it's easy enough to clear history. It's just a pain in the arse having to log back into things.

HansHolbein · 03/02/2025 13:39

I am a Mod on Reddit that has around 3 million users. We have seen some terrible things over the years.

Unfortunately, you can’t stop users doing these disgusting things but what you can do is have a large number of committed mods around the clock 24hrs a day across many time zones and excellent behind the scenes work.

We have very good filters that catch most of this disgusting stuff before any users ever see it and because there are so many of us, the majority of it is removed very quickly. It is then immediately escalated through the Reddit legal department.

I’m stunned that Mumsnet does not have the same sort of filters and behind the scenes stuff going on like we do. We have karma and new account restrictions that pretty much stop awful things such as this becoming public. You have to have this level of protection otherwise stuff like this happens.

Whilst it is impossible to completely prevent it, it is possible to minimise the chance of regular users seeing such awful material and being traumatised for life. It’s been a very long time since a regular user has seen disgusting material on our sub because of the extremely strict and vigilant measures we have in place.

It’s really not good enough.

towelsandsheets · 03/02/2025 13:40

OwlInTheOak · 03/02/2025 13:36

Surely there could be some sort of AI technology to screen for inappropriate photos. It's definitely possible/available given how FB blurs certain content, and I can't see any reason not to implement it.

Yes and no.

Yes, you can and should use automated systems on images and language.

BUT (that's a big but)

It's an ongoing war of development: develop AI to pick something up, and someone develops a way to bypass that system.

Way, way back, automated systems were blocking all discussions about Scunthorpe. It's much harder than people realise even for language; images are another level again. What is the minimum number of pixels that need to change in order to bypass a censor program? The answer can be very low indeed.
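Both halves of that point fit in a few lines of Python. This is only an illustrative sketch: the one-word blocklist and the "image" bytes are made up, and real systems use perceptual hashes rather than exact digests.

```python
import hashlib
import re

BANNED_WORDS = {"cunt"}  # hypothetical one-word blocklist, for illustration

def naive_filter(text: str) -> bool:
    """Flags any post containing a banned word as a substring."""
    lower = text.lower()
    return any(word in lower for word in BANNED_WORDS)

def word_boundary_filter(text: str) -> bool:
    """Flags only whole-word matches, avoiding the Scunthorpe problem."""
    return any(re.search(rf"\b{re.escape(word)}\b", text, re.IGNORECASE)
               for word in BANNED_WORDS)

# The classic false positive: an innocent place name trips the naive filter.
print(naive_filter("Anyone else from Scunthorpe?"))          # True (wrongly blocked)
print(word_boundary_filter("Anyone else from Scunthorpe?"))  # False (allowed)

# Image side: an exact-hash blocklist is trivially bypassed, because
# changing a single bit of the file produces a completely different digest.
original = b"\x89PNG...pretend image bytes..."
tweaked = original[:-1] + bytes([original[-1] ^ 1])  # flip one bit
print(hashlib.sha256(original).hexdigest() == hashlib.sha256(tweaked).hexdigest())  # False
```

This is why image matching in practice uses perceptual hashing (PhotoDNA and similar), which tolerates small pixel-level edits, and why even that remains an arms race rather than a solved problem.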

swallowedAfly · 03/02/2025 13:41

HansHolbein · 03/02/2025 13:39

I am a Mod on Reddit that has around 3 million users. We have seen some terrible things over the years.

Unfortunately, you can’t stop users doing these disgusting things but what you can do is have a large number of committed mods around the clock 24hrs a day across many time zones and excellent behind the scenes work.

We have very good filters that catch most of this disgusting stuff before any users ever see it and because there are so many of us, the majority of it is removed very quickly. It is then immediately escalated through the Reddit legal department.

I’m stunned that Mumsnet does not have the same sort of filters and behind the scenes stuff going on like we do. We have karma and new account restrictions that pretty much stop awful things such as this becoming public. You have to have this level of protection otherwise stuff like this happens.

Whilst it is impossible to completely prevent it, it is possible to minimise the chance of regular users seeing such awful material and being traumatised for life. It’s been a very long time since a regular user has seen disgusting material on our sub because of the extremely strict and vigilant measures we have in place.

It’s really not good enough.

It appears the NW had zero means to escalate and no means of accessing support.

ShortyShorts · 03/02/2025 13:41

I can't remember which forum I was on years ago, but you had to be a member for a certain amount of time before you could post images.

It was to stop people signing up and spamming the boards.

Mind you, for every measure put in place, there's something else to counteract it, I expect.
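That kind of time-gate is cheap to build. A minimal sketch, assuming a 7-day threshold (the threshold and function names here are hypothetical, not anything Mumsnet actually uses):

```python
from datetime import datetime, timedelta

# Hypothetical policy: accounts must be at least 7 days old to attach images.
MIN_ACCOUNT_AGE_FOR_IMAGES = timedelta(days=7)

def may_post_images(account_created: datetime, now: datetime = None) -> bool:
    """Gate image uploads on account age, as many forums do to blunt drive-by spam."""
    if now is None:
        now = datetime.utcnow()
    return now - account_created >= MIN_ACCOUNT_AGE_FOR_IMAGES

signup = datetime(2025, 2, 1, 12, 0)
print(may_post_images(signup, now=datetime(2025, 2, 3, 5, 0)))    # False: account ~2 days old
print(may_post_images(signup, now=datetime(2025, 2, 10, 12, 0)))  # True: account 9 days old
```

It doesn't stop a patient attacker, but it raises the cost of the sign-up-and-spam pattern described above from seconds to days per throwaway account.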

WorriedRelative · 03/02/2025 13:45

swallowedAfly · 03/02/2025 13:22

Those saying it could happen on any site need to be aware that that is exactly why every major site should have emergency protocol for when it does happen. Not tell me when I wake up and then we’ll talk to our lawyers.

Exactly and it is bloody obvious that MNs policies and procedures are woefully inadequate.

The fact the nightwatch have no professional back up in an emergency is negligent in the extreme. I hope they sue for the trauma they have been subjected to as it seems money is the only thing motivating MNHQ so hitting their pocket will be the only way to make them change.

Caerulea · 03/02/2025 13:45

HansHolbein · 03/02/2025 13:39

I am a Mod on Reddit that has around 3 million users. We have seen some terrible things over the years.

Unfortunately, you can’t stop users doing these disgusting things but what you can do is have a large number of committed mods around the clock 24hrs a day across many time zones and excellent behind the scenes work.

We have very good filters that catch most of this disgusting stuff before any users ever see it and because there are so many of us, the majority of it is removed very quickly. It is then immediately escalated through the Reddit legal department.

I’m stunned that Mumsnet does not have the same sort of filters and behind the scenes stuff going on like we do. We have karma and new account restrictions that pretty much stop awful things such as this becoming public. You have to have this level of protection otherwise stuff like this happens.

Whilst it is impossible to completely prevent it, it is possible to minimise the chance of regular users seeing such awful material and being traumatised for life. It’s been a very long time since a regular user has seen disgusting material on our sub because of the extremely strict and vigilant measures we have in place.

It’s really not good enough.

I once caught a clip of a school shooting edited to look like it was part of a big gamer's stream. It had only just been posted, and it took a moment for me to realise what I was seeing; it was horrific. Reported it & it was gone within minutes. I also immediately posted a warning to the sub not to watch that particular post, & the mods took that down too once the video was gone. It was hella quick & very impressive.

whatwouldyoudoifisangoutofkey · 03/02/2025 13:46

These non-comments are actually offensive as a response. You may have contacted the NW and people who had reported at the time (the first report we know of was 11.15pm yesterday), but there are many others who saw those images, including some for whom this will have been extremely damaging given their own history or that of their children.

Yet no apology or accountability thus far let alone acknowledgment of the fact users have been warning you about the dangers of your overnight policies for over a decade.
I pretty much agree with this although I'm leaning towards dismissive and patronising rather than offensive.
I particularly dislike the MN remark "as you can imagine we're pretty busy this morning".
You're not two housewives having coffee round the kitchen table.
You're a large business. Yes, one or two senior people may be tied up helping the authorities, but I'm pretty sure there are enough of you to draft a more respectful message to your readers without asking for understanding because you're busy.

DrSpartacular · 03/02/2025 13:48

At the very least, the ability to post images should have been removed until suitable restrictions/filters are put in place.

HansHolbein · 03/02/2025 13:48

@Caerulea I am glad to read that. Some subs are shit and some Reddit mods are arseholes but the majority of us work extremely hard to protect and support users.

Most of the internet is a cesspit sadly. So much anger and hatred, all under the cloak of anonymity.

tribpot · 03/02/2025 13:50

@HansHolbein unfortunately MN is still run as if it was a fledgling website with a couple of hundred users. It's clear from the amount of identical spam that is reported every morning that there are no keyword filters. The Night Watch are left to manage the other time zones with an extremely limited set of tools in return for no pay and somehow this is all acceptable in 2025 for a site with a huge community from across the world.

When these types of incident happen - thinking of Geoffreygate and others - MN are completely out of their depth. Eventually Justine will post and say that lessons will be learnt and we're taking it all very seriously. Then virtually nothing will change. I suspect a major issue is MN's use of a bespoke bulletin board solution, so unless they specified a requirement to be able to disable posting images, or keyword filtering, or restrictions on new posters, none of this functionality exists. So they would then need to pay for it, out of the ad revenue profits, which would otherwise go elsewhere. I suspect that until MN is sued the financial decision makers will not value making the site safer to use over banking the profits for themselves.

Growlybear83 · 03/02/2025 13:50

@ShortyShorts I thought you could clear your normal browsing history but it was difficult to delete things altogether from your hard drive. I thought some people eventually get caught for various crimes when a really deep examination is carried out of the hard drives on their computers.

GreylingsSkin · 03/02/2025 13:50

Are they going to pay for therapy for those of us affected by this?

My heart aches for the victims in the images. Mumsnet has complete responsibility for this. I'm appalled that Justine, a millionaire, uses (and I mean 'uses' as in she is a user, as this is some cheeky fuckery on acid) volunteers!! It's not a charity!

This also shows the site's weaknesses. I think it should report itself to the police for this. Horrendous. So angry.

GreylingsSkin · 03/02/2025 13:51

tribpot · 03/02/2025 13:50

@HansHolbein unfortunately MN is still run as if it was a fledgling website with a couple of hundred users. It's clear from the amount of identical spam that is reported every morning that there are no keyword filters. The Night Watch are left to manage the other time zones with an extremely limited set of tools in return for no pay and somehow this is all acceptable in 2025 for a site with a huge community from across the world.

When these types of incident happen - thinking of Geoffreygate and others - MN are completely out of their depth. Eventually Justine will post and say that lessons will be learnt and we're taking it all very seriously. Then virtually nothing will change. I suspect a major issue is MN's use of a bespoke bulletin board solution, so unless they specified a requirement to be able to disable posting images, or keyword filtering, or restrictions on new posters, none of this functionality exists. So they would then need to pay for it, out of the ad revenue profits, which would otherwise go elsewhere. I suspect that until MN is sued the financial decision makers will not value making the site safer to use over banking the profits for themselves.

Those of us affected should sue.

WinterBones · 03/02/2025 13:54

I think the response here was seriously lacking, and still is.

Another site I use was hit last year with similar filth.
Within 5 minutes the user was banned, the ability to post pictures and create new accounts was shut down, and the images were removed. 5 minutes later the moderation team had posted a bulletin apologising and reassuring members that it was being dealt with behind the scenes and would be passed to the police.

THAT is how it ought to be done.

ShortyShorts · 03/02/2025 13:56

@Growlybear83, possibly. I don't really know to be honest.

But if anyone's phone/laptop ended up in the hands of the police, it'd be very easy to prove why the images were there.

swallowedAfly · 03/02/2025 13:56

WinterBones · 03/02/2025 13:54

I think the response here was seriously lacking, and still is.

Another site I use was hit last year with similar filth.
Within 5 minutes the user was banned, the ability to post pictures and create new accounts was shut down, and the images were removed. 5 minutes later the moderation team had posted a bulletin apologising and reassuring members that it was being dealt with behind the scenes and would be passed to the police.

THAT is how it ought to be done.

As opposed to more images being posted over a period of hours.

murasaki · 03/02/2025 14:00

Exactly, the poor Night Watch were effectively playing whack-a-mole based on reports.

GreylingsSkin · 03/02/2025 14:00

I think those posters who have receipts of earlier issues with night moderation need to complain to the relevant body, to show that MN were aware there are major weaknesses in the site and did not make changes, leading to last night's incident.

The poor victims are being abused again by their images being publicly posted.

This thread is not accepting new messages.