IIRC MN removed the volunteer mods when there was the whole Top Parenting Guru Gets Offended debacle - I think if you have mods it makes you much more responsible for monitoring and reacting to potentially actionable posts (I could be out of date here, or have misunderstood, though).
However, Twitter and Facebook et al need to get smarter at identifying and removing illegal posts. Harassment is illegal. Threats of violence can be illegal (depending on whether they are serious and addressed to a named individual or group).
That's not just about having a report button. It's about actually assessing the reports that come in to see whether they are serious and important and potentially illegal - so that reporting isn't used to silence women (and others) by aggrieved misogynists/racists/etc. It can't just be put in a box labelled 'too difficult' - Twitter isn't beyond the law.
Making it clear in the terms of service that misogyny, racism, homophobia, disablism, harassment, threatening behaviour etc. will not be tolerated, and that you can be chucked off for indulging in it, might help.
What would also help is making sure all staff, including senior management, have diversity training - and not just about race/religion either, which the ill-informed often assume is 'the issue', ignoring other forms of hatred.