Researchers interviewed young people who engage with and produce radical online content in order to create a set of archetypes of teenage boys who might be vulnerable to radicalisation. Accounts were set up on TikTok for each archetype, each with specific interests – seeking content on masculinity or loneliness, for example – and researchers then watched more than 1,000 videos that TikTok suggested on its “For You” page over seven days.
The initial suggested content matched the stated interests of each archetype, but after five days the researchers found that the TikTok algorithm was presenting four times as many videos with misogynistic content, including objectification, sexual harassment or the discrediting of women, which rose from 13% of recommended videos to 56%.