I always thought that feminism was the fight to give men and women equal chances in life. To give women the opportunity to do the things that men take for granted. And the right to be safe from male violence and abuse.
It tries to give women equal rights with men. Good. But are there not responsibilities that come with those rights? What is a woman's responsibility? Do we have a responsibility to confound expectations? To accept the responsibility of being the main breadwinner sometimes, to sacrifice being at home with the children if it suits the family dynamic, to be the one who goes out into the world and is the active one? To be prepared to change the world for women in little ways, by, for example, walking home alone at night, because each woman who is too scared narrows the world slightly for other women.
And what about men? It's easy to see how they can take responsibility - by treating women with the same respect they grant to other men - but what rights does feminism grant them? The freedom not to be the one who has to take financial responsibility; the freedom not to be seen as a freak for staying at home with the children, if that suits the family dynamic; the right to equal respect as a parent alongside the children's mother.
How do you see it?
Please. Don't post if all you can say is that feminism is a crock of man-hating shit, because I will be obliged to ignore you.