Forgive me, a bit of a GF title I admit! But please bear with me - and excuse the crude explanation and hopefully I'll make some sense!
As I'm sure many here may agree, a lot of the world's problems arguably seem to originate with straight white men in power.
When in power, they seem to do things like throw their weight around, start wars, invent, develop and exploit capitalism, plunder our natural resources, and let their old boys' networks get in the way of progress. This (if you have time to read it) is an incredible example of how the influence of three men and their mutual backslapping led to the perversion of scientific truth and a global health disaster affecting millions.
It seems to me that if women had the power, things could be much better!
But even if things were better with women in charge at first, would it stay like that as we got used to being dominant?
Has it maybe benefited women socially to learn empathy, tolerance of others, negotiation, etc., partly because we've had less access to power? (Rather than us having less power as a result of these attributes.)
Does power always corrupt people? If women were the default gender in power, over time would we start becoming like the men in power now? (Animal Farm popping into my head now!)
Is the underlying issue our systems of power and their corrupting force? If we do ever manage to smash the patriarchy, might a new power elite develop based on some other attribute, and be just as bad? Because if we're aiming for equality of genders in power, that's great; but if it's power itself that corrupts, then whoever holds it will end up just as corrupt! Is there a cooperative model that tackles this?
What comes after the patriarchy? (Is this similar to what comes after capitalism?)
If anyone's still with me, thank you! I'd love to know what you think.
(Forgive me if this is old hat, it's new to me!)