NSPCC calls on social media companies to send 'anti-grooming alerts'


Children's charity the NSPCC is urging social media companies to develop 'alerts' to warn young users of potential grooming behaviour when they're talking online.

The NSPCC has said that social media companies such as Facebook and Instagram should use existing moderating technology to detect language commonly used by child groomers and send warning alerts to children at risk.

Although most social media platforms already use algorithms to flag up images of child abuse, hate speech and extremist material, moderation of grooming behaviour and language is far less widely implemented. The children's charity is asking for detection techniques to be used to pick up typical 'grooming language' systematically and let users know when they may be at risk.

Although there is an existing voluntary code of practice in place for social media companies to keep their young users safe, the NSPCC is calling for a mandatory code to be put in place as part of the government's internet safety strategy.

The NSPCC's call for mandatory grooming moderation on social networks comes less than a year after its successful campaign to make sexual communication with a child illegal. Before this law was introduced in April 2017, police could not intervene until groomers attempted to meet their targets face-to-face.

The charity has highlighted the need for key social networks to take action to help enforce the law – in particular Facebook, Instagram and Snapchat, which were the platforms chosen by child groomers in 63% of all 1,316 online grooming incidents recorded over a six-month period last year.

Tony Stower, Head of Child Safety Online at the NSPCC, said that despite the “staggering number of offences”, government and social networks are not doing enough to stop the growing problem of online child grooming.

“The government's internet safety strategy must require social networks to build in technology to keep their young users safe, rather than relying on police to step in once harm has already been done,” he said.

In response, Minister for Digital, Culture, Media and Sport Matt Hancock said that he would be firm with social media companies. He told BBC Breakfast that as a father of three young children it was something that “really mattered” to him.