
Guest post + Q&A: ‘We must hold tech bosses accountable for child safety online’ [Trigger warning - sexual abuse]

31 replies

NicolaDMumsnet · 26/10/2023 14:51

Rhiannon-Faye McDonald

Rhiannon-Faye McDonald is the Head of Advocacy at the Marie Collins Foundation (MCF), a charity that helps support victims of online child sexual abuse. Rhiannon was the victim of online grooming and sexual harm when she was 13 years old. She has used this personal experience to inform the work of MCF, co-ordinating the Lived Experience group and advocating and campaigning to improve outcomes for victims and their families. She has recently been supporting a UK Home Office campaign urging technology companies, such as Meta, not to roll out end-to-end encryption on messaging platforms without robust safety measures to protect children from sexual abuse and exploitation.

Meta has a responsibility to protect children using its platforms. Yet it is currently planning to introduce end-to-end encryption as standard across Facebook Messenger and Instagram Direct.

For the past two years, as part of the #NoPlaceToHide Campaign, I’ve been calling on Meta to rethink these plans, both as Head of Advocacy for the Marie Collins Foundation and as somebody with lived experience of this abuse. This is an issue incredibly close to my heart, and Meta’s plans will directly impact me, which is why I have chosen to speak out about it.

When I was 13 years old, I was groomed online and sexually abused. A child sex offender manipulated me into sending him a topless photo of myself, which he immediately used to blackmail me for more explicit images and for my address. He came to my home the following morning and sexually abused me, taking yet more photos.

It all happened within 24 hours, and yet I am still dealing with the impact even now, 20 years later.

I have no idea what happened to those photos. I know the police found them on his computer and they were used in the criminal justice process, but I don’t know who else has seen them or may see them in the future. I have no control over those images – I can’t ‘un-create’ them, I can’t get them back. They are out in the world for child sex abusers to view and share, and there’s nothing I can do about it.

I sometimes hear people talk about this as though they are “just images”. But I can’t tell you how big an impact this has had and continues to have on me. When I think of people seeing them, I feel like a victim all over again. The thought of abusers getting satisfaction from looking at them and sharing them makes me feel physically sick. When I walk into a room or down the street, I look at people’s faces and can’t help but think: “have you seen images of me being sexually abused when I was 13? Do you recognise me?” 

I remember the moment I learned that we have technological tools and people working very hard to detect and remove child sexual abuse material. I immediately felt a wave of relief knowing that even if my images were shared, they would be found and taken down quickly.

Meta’s plan to introduce end-to-end encryption snatches that relief away from me again. It is utterly devastating. The information that Meta and other tech platforms send to UK law enforcement contributes to over 800 arrests of child sex abusers and safeguards around 1,200 children per month. Meta is the biggest contributor to these reports. In 2022, there were 31.8 million reports of child sexual abuse material made to NCMEC, and of these, 21.1 million came from Facebook and 5 million from Instagram.

If Meta continues with their plan, they will be unable to see child sexual abuse happening on their platforms and, therefore, will be unable to report it. This will allow abusers to groom and abuse children without detection, and poses a catastrophic risk to children.

The debate around end-to-end encryption has long been framed as a binary choice: privacy vs children’s safety. This isn’t true. To be clear, I’m all for strong privacy measures – after all, my privacy is infringed by the fact that the images of my abuse are out there and could be shared. I think this is a key point that has been missed because victims and survivors have not been included in the discussions: we deserve privacy too!

The fact is, experts have demonstrated that it would be technically feasible to detect child sexual abuse within an end-to-end encrypted environment whilst maintaining strong user privacy. Meta needs to invest urgently in these technologies to create a solution for its platforms; however, it is choosing not to.

It is one of the biggest tech companies in the world, with truly significant resources at its fingertips. It has the power to make positive change in this area, and yet it is choosing to turn a blind eye to the protection of children. We must all tell Meta that this is not OK, that it must safeguard children, and that it cannot give child sex abusers a place to hide.


^^Rhiannon has partnered with the Home Office and Internet Watch Foundation to create a guide for parents about end-to-end encryption and how to help keep your child safe online. Read the parent guide here.

Rhiannon and the Security Minister Tom Tugendhat will be returning to this thread on Friday 10th November at 3:30pm to answer questions. If you have a question for either of them, please leave it below.

OP posts:
Itsdifficulttodomyjobsometimes · 02/11/2023 12:03

Despite working in a related field I wasn't aware of Meta's plans to bring in end to end encryption, and I have to say I'm horrified by the proposal.
I've thankfully never had to see images but do suffer with secondary trauma as a result of my work. I can't begin to imagine how hard it must be for the victims and their families to cope.

Have Meta given any indication as to how they will protect children and vulnerable people when they bring in end to end encryption? They surely can't pretend that the issue no longer exists.

IsItHotAgainTomorrow · 04/11/2023 00:23

Interesting that you are 'picking on' Facebook. WhatsApp (which is owned by Meta) and iMessage are encrypted end to end already.

Will Cathcart (of Meta/WhatsApp) has said that if the UK bans End-to-End encryption, WhatsApp will not comply, which must effectively mean it disappears from the UK.

I suppose if you don't have a messaging app then people can't abuse it, so maybe you achieve your objective.

helpimgoingmadatmydp · 07/11/2023 16:38

Thank you for sharing your experience Rhiannon and your work bringing awareness to this issue.

My question is: is there anything that the government can do to intervene and ensure that companies who use end-to-end encryption have to comply with certain safety measures? It seems so scary that children may be unsafe due to companies using end-to-end encryption. Thank you.

Dialbackonthedigital · 09/11/2023 21:04

I applaud the efforts you are making to keep children safe. However, Meta and others could protect children in a heartbeat if they chose to. But big tech companies are making a choice to prioritise profit over child safety. Do you not think, as a society, we should look at this in a different way? If social media and unfettered internet access aren't safe for children, then shouldn't we regulate the devices they can access unsafe platforms on? The portable nature of smartphones is a huge problem here. The risk of children stumbling across dangerous, harmful content and being sought out by predators would be significantly reduced if we took steps to regulate the devices themselves, which are addictive by design and a gateway to harm for children.

chaffinch77 · 10/11/2023 09:14

Is there actually anything the govt can do if Meta choose to go ahead with this? I'm assuming the govt are unlikely to ban huge social media companies or effectively sanction them so are we just relying on them to do the 'right thing'? And if so, where do you draw the line?

whyisitalwaysraining91 · 10/11/2023 10:27

Do the tools to detect child sexual abuse content on encrypted messages without affecting user privacy already exist? Or would each tech organisation (eg. Meta) be responsible for creating their own tools?

DawnAttwood · 10/11/2023 12:01

"Meta is the biggest contributor to these reports. In 2022, there were 31.8 million reports of child sexual abuse material made to NCMEC, and of these 21.1 million came from Facebook and 5 million from Instagram." - this stat is absolutely staggering. If Meta do proceed with these plans and, as Rhiannon says, these reports will no longer be possible, how on earth does the govt plan to tackle that gap in reporting?

limebasilandmandarin · 10/11/2023 12:03

is this not something that should have/could have been included in the Online Safety Bill?

Kenickie23 · 10/11/2023 15:09

Hi Tom, I saw your video on Twitter so this is a bit more of a general question about online safety, but I do wonder where govt see the line in terms of whose responsibility it is? How do you see the divide in responsibility between parents, government, social media companies?

SarahHasaBlackCat · 10/11/2023 15:14

Hi Rhiannon - thank you so much for being prepared to share your experience so that you can spread awareness of this issue. I wonder if you have had the chance to raise these issues directly with Meta and, if so, what their response has been? Because the figures you have cited mean the gap this is going to leave in reporting is huge - how can they justify it?

UKSecurityMinister · 10/11/2023 15:30

Hello, I’m Tom Tugendhat, the UK Security Minister. I’m here with Rhiannon to answer your questions about child safety online, and what Meta and others need to be doing to help keep our kids safe. I’m grateful to Mumsnet for facilitating such an important conversation.

I’ll be here for the next hour, answering as many questions as I can, please do share them in the comments.

For more information about end-to-end encryption (E2EE) you can visit: End-to-end encryption and child safety - GOV.UK (www.gov.uk)

End-to-end encryption and child safety

https://www.gov.uk/government/publications/end-to-end-encryption-and-child-safety

UKSecurityMinister · 10/11/2023 15:41

Itsdifficulttodomyjobsometimes · 02/11/2023 12:03

Despite working in a related field I wasn't aware of Meta's plans to bring in end to end encryption, and I have to say I'm horrified by the proposal.
I've thankfully never had to see images but do suffer with secondary trauma as a result of my work. I can't begin to imagine how hard it must be for the victims and their families to cope.

Have Meta given any indication as to how they will protect children and vulnerable people when they bring in end to end encryption? They surely can't pretend that the issue no longer exists.

@Itsdifficulttodomyjobsometimes I’m really sorry to hear you suffer with trauma from your job. I hope you’re being well supported.

Meta has said that they will put in place safety measures to help mitigate the risk, but they aren’t anywhere near good enough.

To be fair, they’re currently really good at finding and reporting child sexual abuse content because they proactively scan their platforms to find images and videos of abuse, and to detect instances where a child may be being groomed. However, with E2EE they will stop looking for child sexual abuse on Facebook Messenger and Instagram Direct and rely on measures such as age verification, and on children to identify and report that they are being abused. 

That’s not good enough. Unfortunately, some people aren’t honest when it comes to age verification – children pretend to be older and some adults pretend to be younger. And relying on children to report harmful and illegal content makes no sense - the burden shouldn’t be on them. A lot of the time children aren’t even aware they’re being groomed or sexually abused until it’s too late.

I want to be clear – I’m pro-encryption. But it’s wrong to suggest that there’s a binary choice between privacy and security. The world’s best cryptography experts have shown how companies such as Meta can reduce child sexual abuse online while guarding people’s personal privacy. We want Meta to work with us and invest in technologies that will protect user privacy without putting children at greater risk.

RhiannonMCF · 10/11/2023 15:42

Hi I’m Rhiannon-Faye McDonald, I’m here with the UK Security Minister to follow up on my post above and answer any questions you may have about online safety and Meta's plans around end-to-end encryption.

Please feel free to ask any questions in the comments.

UKSecurityMinister · 10/11/2023 15:46

IsItHotAgainTomorrow · 04/11/2023 00:23

Interesting that you are 'picking on' Facebook. WhatsApp (which is owned by Meta) and iMessage are encrypted end to end already.

Will Cathcart (of Meta/WhatsApp) has said that if the UK bans End-to-End encryption, WhatsApp will not comply, which must effectively mean it disappears from the UK.

I suppose if you don't have a messaging app then people can't abuse it, so maybe you achieve your objective.

@IsItHotAgainTomorrow Facebook Messenger and Instagram Direct account for 85% of current reports of child sexual abuse online. That’s why we’re calling on Meta (who own both) to change their plans, because we know their platforms are being used by offenders.
 
They’re not our only concern, because we know child sexual abuse is present on other social media platforms too.

I want to make sure that when social media companies roll out end-to-end encryption, they’re doing so with the appropriate safeguards needed to keep children safe.

UKSecurityMinister · 10/11/2023 15:55

There have been a few questions about government intervention and legislation, so I hope this answer will address your points.

Our new Online Safety Act gives Ofcom powers to regulate social media companies. They will be able to direct tech companies to make their best efforts to introduce effective safeguards on their platforms.

If they fail to comply Ofcom can impose fines of up to 10% of the company’s global annual turnover – it is a significant measure for dealing with this problem.

However, we would much rather work collaboratively with companies to introduce safeguards. That’s what our conversations with Meta over the past few months have been about.

RhiannonMCF · 10/11/2023 16:00

SarahHasaBlackCat · 10/11/2023 15:14

Hi Rhiannon - thank you so much for being prepared to share your experience so that you can spread awareness of this issue. I wonder if you have had the chance to raise these issues directly with Meta and, if so, what their response has been? Because the figures you have cited mean the gap this is going to leave in reporting is huge - how can they justify it?

Thanks @SarahHasaBlackCat I have raised it directly with Meta, I wrote a letter to them last year (you can read it here) and met with their head of safety policy to discuss it. Their response was quite careful to point out the many things they have put in place or will be introducing to try and keep children safe on the platforms when end-to-end encryption is introduced as standard, and essentially said that they don't need to scan for child sexual abuse material to be able to protect children. Those measures are all really useful but they aren't sufficient on their own and need to be complemented with the abilities they already have to find and remove child sexual abuse material. Personally I don't think they can justify it - it will leave children vulnerable to abusers and victims/survivors like myself open to ongoing revictimisation while images of our abuse continue to be shared.

No Place to Hide

https://noplacetohide.org.uk/letter/

UKSecurityMinister · 10/11/2023 16:04

Dialbackonthedigital · 09/11/2023 21:04

I applaud the efforts you are making to keep children safe. However, Meta and others could protect children in a heartbeat if they chose to. But big tech companies are making a choice to prioritise profit over child safety. Do you not think, as a society, we should look at this in a different way? If social media and unfettered internet access aren't safe for children, then shouldn't we regulate the devices they can access unsafe platforms on? The portable nature of smartphones is a huge problem here. The risk of children stumbling across dangerous, harmful content and being sought out by predators would be significantly reduced if we took steps to regulate the devices themselves, which are addictive by design and a gateway to harm for children.


@Dialbackonthedigital The devices we all use can be so important for kids. Mine use an iPad to watch David Attenborough and my daughter even does Cosmic Kids Yoga – and loves it. As a government, we don’t want to prevent children from utilising technology in their lives.

But I understand your point. That is why the Government has placed the onus on technology companies to ensure that their services and platforms are safe for children. The new Online Safety Act requires technology companies to put in place strong measures that will ensure children cannot access illegal material and that they cannot be targeted by predators. Ofcom, the regulator, will outline the legal duties that companies must follow, and companies must either comply with these duties or show that the steps they are taking are equally effective.

RhiannonMCF · 10/11/2023 16:13

UKSecurityMinister · 10/11/2023 15:41

@Itsdifficulttodomyjobsometimes I’m really sorry to hear you suffer with trauma from your job. I hope you’re being well supported.

Meta has said that they will put in place safety measures to help mitigate the risk, but they aren’t anywhere near good enough.

To be fair, they’re currently really good at finding and reporting child sexual abuse content because they proactively scan their platforms to find images and videos of abuse, and to detect instances where a child may be being groomed. However, with E2EE they will stop looking for child sexual abuse on Facebook Messenger and Instagram Direct and rely on measures such as age verification, and on children to identify and report that they are being abused. 

That’s not good enough. Unfortunately, some people aren’t honest when it comes to age verification – children pretend to be older and some adults pretend to be younger. And relying on children to report harmful and illegal content makes no sense - the burden shouldn’t be on them. A lot of the time children aren’t even aware they’re being groomed or sexually abused until it’s too late.

I want to be clear – I’m pro-encryption. But it’s wrong to suggest that there’s a binary choice between privacy and security. The world’s best cryptography experts have shown how companies such as Meta can reduce child sexual abuse online while guarding people’s personal privacy. We want Meta to work with us and invest in technologies that will protect user privacy without putting children at greater risk.

@Itsdifficulttodomyjobsometimes I'm echoing @UKSecurityMinister here, I'm sorry that you have to deal with secondary trauma and hope there is support available to you.

I just wanted to pick up on your point about how hard it must be for victims and families to cope because it's such an important point on this topic. It is incredibly difficult and requires specialist support to help victims on their recovery journey.

The impact of child sexual abuse is awful enough, but the fact that there are images and/or videos of that abuse which can be shared online indefinitely, is just horrendous. The permanency and lack of control over them can be almost unbearable, and unfortunately it's something we have to live with every day.

What we, as victims and survivors, need is for tech companies to be doing everything they can to find and remove those images to prevent us from being retraumatised and revictimised each time they are shared. Which is why Meta's plans feel like such a huge step backwards as they will lose the ability to do this.

UKSecurityMinister · 10/11/2023 16:14

whyisitalwaysraining91 · 10/11/2023 10:27

Do the tools to detect child sexual abuse content on encrypted messages without affecting user privacy already exist? Or would each tech organisation (eg. Meta) be responsible for creating their own tools?

@whyisitalwaysraining91 Yes, they do. It is possible to keep our children safe online and protect users’ privacy. To show this, the Government set up a £425,000 fund to support the development of potential solutions that could be used to keep children safe online. The fund and solutions developed have already demonstrated that it is technically feasible to protect children whilst maintaining user privacy.
 
Every platform is different, so it is up to companies like Meta, who have some of the brightest minds in the world and the greatest resources, to build on what we have started and develop a solution that works best for their platform.

AndApplePie · 10/11/2023 16:16

What do social media companies and the government actually do with all the reports Meta's currently providing? Will it actually make that much of a difference?

UKSecurityMinister · 10/11/2023 16:20

DawnAttwood · 10/11/2023 12:01

"Meta is the biggest contributor to these reports. In 2022, there were 31.8 million reports of child sexual abuse material made to NCMEC, and of these 21.1 million came from Facebook and 5 million from Instagram." - this stat is absolutely staggering. If Meta do proceed with these plans and, as Rhiannon says, these reports will no longer be possible, how on earth does the govt plan to tackle that gap in reporting?

@DawnAttwood You’re right. This is staggering; the actions Meta are about to take will put millions of children in the UK at greater risk. They would leave so many around the world more vulnerable.
 
I’ve just returned from a trip to the US where I met with Meta senior leaders to discuss child safety. I’m sure we can work with Meta and other companies to find a solution that keeps our children safe but that means we all need to play our part.
 
The Government has introduced new laws under the Online Safety Act which will include a requirement to report instances of child sexual abuse to the National Crime Agency and others. We’re also giving more resources to the National Crime Agency, our top crime fighters, helping them increase their networks of undercover officers, and improve the response of police forces up and down the country to make sure that they can continue to arrest the worst offenders.    

HuskyOwner89 · 10/11/2023 16:22

If things are as bad as you say, should we be allowing our children to use social media platforms at all?

RhiannonMCF · 10/11/2023 16:24

AndApplePie · 10/11/2023 16:16

What do social media companies and the government actually do with all the reports Meta's currently providing? Will it actually make that much of a difference?

@AndApplePie It will make a significant difference. The information that social media companies give to UK law enforcement contributes to over 800 arrests of suspected child sexual abusers and results in an estimated 1,200 children being safeguarded from child sexual abuse every month.   

UKSecurityMinister · 10/11/2023 16:27

Kenickie23 · 10/11/2023 15:09

Hi Tom, I saw your video on Twitter so this is a bit more of a general question about online safety, but I do wonder where govt see the line in terms of whose responsibility it is? How do you see the divide in responsibility between parents, government, social media companies?

@Kenickie23 Let’s be clear – any crime is the responsibility of the criminal. Any offender is responsible for their own actions and no excuses are acceptable. But we all know that we have a responsibility to make our spaces safe.
 
It is up to social media companies to do what they can to make their platforms safe, and up to us in the government to let you know what your kids could be dealing with on each platform. I’m not going to tell you how to raise your kids or what is appropriate for you but I will warn when I know there is a risk.
 
Making social media platforms safe is down to companies like Meta who have vast power and influence over our lives. It’s a bit Spiderman, but with great power comes great responsibility. It’s not ok for tech executives to take profits and pass the buck. If they’re hosting kids then they should know that they’re hosting them as safely as possible.

We’re encouraging tech companies to do what they know they need to do, and we’re offering all the help we can to solve any problems they may face. They’re the experts on their platforms so, ultimately, it’s up to them to take the actions needed.

UKSecurityMinister · 10/11/2023 16:29

RhiannonMCF · 10/11/2023 16:00

Thanks @SarahHasaBlackCat I have raised it directly with Meta, I wrote a letter to them last year (you can read it here) and met with their head of safety policy to discuss it. Their response was quite careful to point out the many things they have put in place or will be introducing to try and keep children safe on the platforms when end-to-end encryption is introduced as standard, and essentially said that they don't need to scan for child sexual abuse material to be able to protect children. Those measures are all really useful but they aren't sufficient on their own and need to be complemented with the abilities they already have to find and remove child sexual abuse material. Personally I don't think they can justify it - it will leave children vulnerable to abusers and victims/survivors like myself open to ongoing revictimisation while images of our abuse continue to be shared.

@SarahHasaBlackCat, @RhiannonMCF has been courageous enough to share her experience and I can’t thank her enough for coming forward and speaking out. I have been talking to senior leaders in Meta – I was there only a few weeks ago – to make sure they understand we’re serious about this and will be reminding them, and others, of their responsibilities to users. 
