Sometimes when I read feminist theory, it seems to suggest that women can do nothing about their "lot" in life; they are apparently just women, after all. I don't find this particularly empowering, and I don't agree that women can't change things.
Why are we discussing it if we don't intend to change it? What's your view on feminism and inherent victimisation?