I AM a Feminist, but by God, over the last few decades it has radicalised a lot of people into believing the very worst of men and seeing women as victims at every turn. These are the people who are ruining Feminism IMO, and making many young women want nothing to do with it, which is a terrible shame.
Do you agree or not? ONE post each, please (otherwise I think it will veer off-topic or get derailed). I will make no further posts; I'm just conducting a survey, as it were.