Has feminism brought women and men closer together?
Are feminists shit-testing white men to see how much they will give them?
Have feminists been useful idiots for the government to get more women into work?
Do lesbian feminists have too loud a voice in the movement?
Would you rather be a mother to your children, or have the state take care of them?
Is the breaking down of gender roles leading to a lot of very confused people?
If there is a societal collapse, will women and men return to more traditional gender roles?