This crosses my mind a lot when I see feminist issues discussed. It seems that, to get on an equal footing with men, women are encouraged to act more like men rather than celebrate the fact that they're women.
It's as though femininity is looked down upon as second class to masculinity and therefore discouraged.
Does anyone get what I mean? I'm not particularly well read on feminism (plus I'm a man), but I'd love it if someone could flesh this out a bit more (or tell me where I'm misinterpreting a lot of feminist views).
Thanks