A new report from the University of Massachusetts' 'Rescue Lab' (due out yesterday, although I can't find the report itself yet):
https://www.rescue-lab.org/
highlights that Instagram has a particular problem with CSE images and content:
“Instagram is an onramp to places on the internet where there’s more explicit child sexual abuse,” said UMass Rescue Lab director Brian Levine. The Stanford group also found that CSAM content is "particularly severe" on the site. "The most important platform for these networks of buyers and sellers seems to be Instagram."
https://archive.is/ps6fp
'In many cases, Instagram has permitted users to search for terms that its own algorithms know may be associated with illegal material. In such cases, a pop-up screen for users warned that “These results may contain images of child sexual abuse,” and noted that production and consumption of such material causes “extreme harm” to children. The screen offered two options for users: “Get resources” and “See results anyway.” '
Another article covering Meta's response:
https://www.yahoo.com/entertainment/meta-vows-to-take-action-after-report-found-instagrams-algorithm-promoted-pedophilia-content-133343896.html